
Live action navigation method and live action navigation device

A real-scene navigation method and device, applied to road network navigation and related fields, which can solve the problems of the large data volume of street-view images and the difficulty users face in choosing a direction of action, achieving the effect of reducing the amount of data transmitted.

Inactive Publication Date: 2012-12-19
BEIJING BAIDU NETCOM SCI & TECH CO LTD

AI Technical Summary

Problems solved by technology

However, street-view images are captured in advance. Because the user's viewing angle differs from that of the capture, the street view differs from the actual scene the user is in, and the user must work out which street-view location corresponds to their own scene. It is also difficult for the user to choose a direction of action, and the amount of street-view image data received by the mobile terminal is too large.



Examples


Embodiment 1

[0059] Figure 2 is a flow chart of the real-scene navigation method provided by Embodiment 1 of the present invention. As shown in Figure 2, the method mainly includes the following steps:

[0060] Step 201: The system control center receives a real-scene navigation request from the user interface control system; the request includes information identifying destination A.

[0061] The user requests a navigation route from the current location to destination A through the user interface control system; that is, the user interface control system sends the information of destination A to the system control center.

[0062] Step 202: The system control center obtains the user's current geographic location B from the user positioning system.

[0063] The user positioning system determines the user's current geographic location based on satellite positioning systems (GPS signals, etc.), wireless communication networks, WiFi networ...
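To make steps 201-202 concrete, here is a minimal Python sketch of the flow (the later matching and routing steps are summarized in the abstract). All names here (SystemControlCenter, UserPositioningSystem, NavigationRequest) are hypothetical illustrations, not identifiers from the patent.

```python
from dataclasses import dataclass


@dataclass
class NavigationRequest:
    destination: str  # information identifying destination A (step 201)


class UserPositioningSystem:
    def locate(self):
        # Per [0063], location B may come from a satellite positioning
        # system (GPS), a wireless communication network, or WiFi; a
        # fixed placeholder coordinate stands in here.
        return (39.9042, 116.4074)  # hypothetical latitude/longitude


class SystemControlCenter:
    def __init__(self, positioning):
        self.positioning = positioning

    def handle_request(self, request):
        # Step 201: receive the real-scene navigation request, which
        # carries the information of destination A.
        destination_a = request.destination
        # Step 202: obtain the user's current geographic location B
        # from the user positioning system.
        location_b = self.positioning.locate()
        return location_b, destination_a


center = SystemControlCenter(UserPositioningSystem())
print(center.handle_request(NavigationRequest(destination="A")))
```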

Embodiment 2

[0092] Figure 11 is a structural diagram of the real-scene navigation device provided by Embodiment 2 of the present invention. As shown in Figure 11, the device may include: a real-scene image acquisition unit 1101, a network image acquisition unit 1102, a matching image acquisition unit 1103, and an action direction determination unit 1104.

[0093] The real-scene image acquisition unit 1101 is configured to acquire a real-scene image I of the scene where the user is currently located from the user's image capture device. The real-scene image I is image information of the real scene the user is in; the captured image may be a single-frame image or a video image.

[0094] The network image acquisition unit 1102 is configured to acquire, from the network image database, images corresponding to geographic locations within a preset range of the user's current geographic location B, so as to form an image data set U. The network image database cont...
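A minimal sketch of how the four units of Embodiment 2 could be organized, assuming plain Python classes: the unit names follow the text above, but every method body is a placeholder rather than the patent's actual logic.

```python
class RealSceneImageAcquisitionUnit:      # unit 1101
    def acquire(self):
        """Return real-scene image I from the user's image capture
        device (a single frame or a video frame)."""
        return "image_I"  # placeholder


class NetworkImageAcquisitionUnit:        # unit 1102
    def acquire(self, location_b, preset_range):
        """Return image data set U: images whose geographic locations
        lie within `preset_range` of the user's current location B."""
        return ["u1", "u2", "u3"]  # placeholder set U


class MatchingImageAcquisitionUnit:       # unit 1103
    def best_match(self, image_i, image_set_u):
        """Return U0, the image in U with the highest matching degree
        to I (the matching metric is not specified in this excerpt)."""
        return image_set_u[0]  # placeholder choice


class ActionDirectionDeterminationUnit:   # unit 1104
    def determine(self, u0, location_b, destination_a):
        """Combine U0's orientation information, location B, and the
        planned path to destination A into a direction of action."""
        return f"head toward {destination_a} from {location_b}"
```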



Abstract

The invention provides a live action navigation method and a live action navigation device. The live action navigation method comprises the following steps: obtaining a real image I of the user's current scene from a user image collection device; obtaining images, which correspond to geographic positions within a predetermined range of the user's current geographic position, from a network image database so as to form an image data set U; transmitting the real image I and the image data set U to an image matching system, and obtaining from the image matching system an image U0 in the image data set U which has the highest matching degree with the real image I; determining orientation information of the image U0 from the network image database, and determining the user's route by utilizing the orientation information, the geographic position B, and an action path from the user's current geographic position B to a destination A. According to the invention, the user can conveniently select the route to the destination, and the quantity of data received by the mobile terminal is reduced.
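The abstract leaves the matching metric unspecified. As one plausible, assumed instantiation, the sketch below scores each candidate image in set U against the real image I using ORB features with brute-force Hamming matching from OpenCV; this illustrates the "highest matching degree" step but is not the patent's disclosed algorithm. Inputs are assumed to be grayscale NumPy arrays.

```python
import cv2


def match_score(img_i, img_u):
    """Count cross-checked ORB matches between two grayscale images."""
    orb = cv2.ORB_create()
    _, des_i = orb.detectAndCompute(img_i, None)
    _, des_u = orb.detectAndCompute(img_u, None)
    if des_i is None or des_u is None:
        return 0  # no features detected in one of the images
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    return len(matcher.match(des_i, des_u))


def select_u0(image_i, image_set_u):
    """Return U0: the image in U with the highest matching degree to I."""
    return max(image_set_u, key=lambda img_u: match_score(image_i, img_u))
```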

Description

【Technical field】

[0001] The invention relates to the field of computer application technology, and in particular to a real-scene navigation method and a real-scene navigation device.

【Background technique】

[0002] The development of electronic map navigation systems has brought great convenience to people's travel. Such a system usually locates the user's current geographic position using satellite positioning signals, a wireless communication network, or a WiFi network, and uses dynamic programming to plan the user's path of action.

[0003] At present, most systems are based on two-dimensional (2D) top-view logical maps. In practical applications, however, when the user reaches a fork in the road, the navigation map can only give the planned route; it cannot indicate in real time which specific intersection the user should currently select, because there is a large gap between the 2D top-view logical map and the real environment the user is in, so the user still needs ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01C21/26
Inventors: 韩钧宇 (Han Junyu), 邓超 (Deng Chao), 吕赛 (Lü Sai)
Owner: BEIJING BAIDU NETCOM SCI & TECH CO LTD