
Navigation method based on urban real scene

A navigation method and real-scene technology applied in the field of real-scene navigation, which obtains real-scene images in accordance with people's visual habits. It addresses the problems of large model data volume and poor transmission performance, achieving improved visual effects, good accuracy, and the omission of the 3D modeling process.

Pending Publication Date: 2020-06-02
星际空间(天津)科技发展有限公司

AI Technical Summary

Problems solved by technology

On the premise of achieving the same effect as a 3D electronic map, the method solves the problems of large model data volume and poor transmission performance found in 3D electronic maps.


Image

Three drawings accompany the application, each titled "Navigation method based on urban real scene".

Examples


Embodiment 1

[0115] As shown in Figure 1, a navigation method based on an urban real scene comprises the following steps:

[0116] Data acquisition: real-scene image data is captured with image acquisition equipment, and GNSS data is collected with GPS positioning equipment;
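The patent text does not specify how captured frames and GNSS fixes are stored or associated. As a rough illustration only, the Python sketch below assumes each real-scene frame and each GNSS fix carries a timestamp and pairs them by nearest time; the type names (SceneFrame, GnssFix) and the max_gap_s threshold are hypothetical, not from the patent.

```python
# Minimal sketch of the acquisition step: pair each captured real-scene image
# with the GNSS fix closest to it in time. Record layout and the pairing rule
# are illustrative assumptions, not details given in the patent text.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class GnssFix:
    timestamp: float   # seconds since epoch, from the GPS positioning device
    lat: float
    lon: float

@dataclass
class SceneFrame:
    timestamp: float   # capture time written by the acquisition equipment
    image_path: str    # real-scene image stored on disk
    fix: Optional[GnssFix] = None

def pair_frames_with_gnss(frames: List[SceneFrame],
                          fixes: List[GnssFix],
                          max_gap_s: float = 1.0) -> List[SceneFrame]:
    """Attach to each captured image the GNSS fix closest in time."""
    for frame in frames:
        best = min(fixes, key=lambda f: abs(f.timestamp - frame.timestamp),
                   default=None)
        if best is not None and abs(best.timestamp - frame.timestamp) <= max_gap_s:
            frame.fix = best
    return frames
```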

[0117] Data processing: retrieve two adjacent images from the collected real-scene images; extract feature points from both images and perform initial feature-point matching to obtain an initial matching point set; apply spatial geometric constraints between the feature points, filter the matching points according to those constraints to obtain a new matching point set, and compute the transformation matrix; perform a perspective transformation on the second image using the obtained transformation matrix; mosaic the two images with a weighted-average method to obtain a wide-viewing-angle image; read the timestamps in the real-scene images ...
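Paragraph [0117] names the processing steps but not the specific algorithms. As a minimal sketch, assuming ORB feature points, a Lowe ratio test for the initial matching, a RANSAC-estimated homography serving as both the geometric constraint filter and the transformation matrix, OpenCV's warpPerspective for the perspective transformation, and a 50/50 weighted average in the overlap region, the pipeline could look like the following; the stitch_pair function and its parameters are illustrative, not from the patent.

```python
import cv2
import numpy as np

def stitch_pair(img1: np.ndarray, img2: np.ndarray) -> np.ndarray:
    """Stitch two adjacent real-scene images (same height, BGR) into a wider view."""
    # 1. Extract feature points from both real-scene images (ORB assumed).
    orb = cv2.ORB_create(4000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)

    # 2. Initial feature-point matching (brute-force Hamming + ratio test).
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    good = []
    for pair in matcher.knnMatch(des1, des2, k=2):
        if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
            good.append(pair[0])
    if len(good) < 4:
        raise ValueError("not enough matches to estimate a transformation")

    # 3. Filter matches with a spatial geometric constraint (RANSAC assumed)
    #    and obtain the transformation matrix (homography assumed).
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(pts2, pts1, cv2.RANSAC, 5.0)

    # 4. Perspective-transform the second image into the first image's frame.
    h1, w1 = img1.shape[:2]
    warped2 = cv2.warpPerspective(img2, H, (w1 + img2.shape[1], h1))

    # 5. Mosaic with a weighted average (50/50 assumed) in the overlap region.
    canvas1 = np.zeros_like(warped2)
    canvas1[:, :w1] = img1
    overlap = (canvas1.sum(axis=2) > 0) & (warped2.sum(axis=2) > 0)
    blended = (0.5 * canvas1 + 0.5 * warped2).astype(np.uint8)
    return np.where(overlap[..., None], blended, np.maximum(canvas1, warped2))
```

A wide-viewing-angle mosaic produced this way can then be associated with the timestamp-matched GNSS position from the acquisition step.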



Abstract

The invention, which relates to the field of real-scene navigation, discloses a navigation method based on an urban real scene. The method comprises the steps of data collection, data processing, data storage, and data invocation. The method has the following advantages: real-scene navigation data can be acquired with an ordinary camera, so acquisition is simple and fast and the time-consuming, labor-intensive three-dimensional modeling process of three-dimensional navigation is omitted; an intended destination can be searched in the real-scene navigation system, enabling browsing in advance and replacing on-site investigation; the real scene of the corresponding place can be seen directly, which facilitates accurate and rapid positioning, offers good intuitiveness and accuracy, and creates a brand-new way of reading maps from the real scene. By combining the real scene with navigation, users are helped to know their position, so that landmark buildings can be distinguished in advance and a path forward determined; and the user views the surroundings from a parallel (eye-level) viewing angle, a mode that is closer to people's perceptual habits.

Description

Technical field

[0001] The present invention relates to the field of real-scene navigation applications, and in particular to a navigation method based on an urban real scene that obtains real-scene images in accordance with people's visual habits, improves the visual effect, creates a new visual experience, can carry all attribute information, and satisfies the user's need for detailed and intuitive understanding. That is, the real-scene map is fully consistent with the actual scene, providing users with more detailed information and a more realistic, accurate map service.

Background technique

[0002] According to surveys and statistics, navigation electronic maps in China are currently mainly two-dimensional maps. Two-dimensional navigation maps are popular with the public, but they still have many shortcomings: on the one hand, the traditional two-dimensional map is a line drawing to which attribute information must be added, bu...


Application Information

IPC(8): G01C21/20; G01C21/34; G01S19/14
CPC: G01C21/20; G01C21/34; G01C21/343; G01C21/3602; G01C21/3623; G01S19/14; Y02A30/60
Inventor: 翟婧, 徐亚丽, 鞠辰, 许凡
Owner: 星际空间(天津)科技发展有限公司