Indoor passive navigation and positioning system and indoor passive navigation and positioning method

A navigation and positioning technology, applied in the fields of image processing and robotics, which addresses problems such as environmental changes.

Active Publication Date: 2014-12-24
SHANGHAI UNIV


Problems solved by technology

If a three-dimensional map of the interior of the disaster site is available, map-matching navigation is feasible. However, in mos…



Examples


Embodiment 1

[0046] Referring to Figures 1 and 2, the indoor passive navigation and positioning system of the present invention includes a depth camera Kinect (1) and an xsens inertial navigation device (2). The robot (5) operates in an indoor environment; the Kinect (1) and the xsens device (2) are integrated inside the robot (5) and connected through two USB serial ports to the industrial computer SBC84823 (3) inside the robot (5). The industrial computer (3) processes the image data and depth data returned by the Kinect (1) and the attitude-angle data of the robot (5) returned by the xsens device (2). As shown in Figure 3, the identifier (4) is a special digital code that contains the identifier's location information and is captured by the Kinect (1) as image data. The code consists of two parts, a matching area and a digital area: the matching area contains three matching templates composed of salient feature…
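Locating the matching templates of the identifier (4) in the camera image requires some form of template matching. The patent does not specify the algorithm; the sketch below uses plain normalized cross-correlation over a grayscale array, which is one common choice. The function name and array shapes are illustrative, not taken from the patent.

```python
import numpy as np

def match_template_ncc(image, template):
    """Slide `template` over `image` and return the (row, col) of the best
    normalized cross-correlation score, plus that score.
    Illustrative only; the patent does not name the matching method."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -1.0, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            w = image[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz ** 2).sum()) * t_norm
            if denom == 0:
                continue  # flat window, correlation undefined
            score = (wz * t).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score
```

A production system would use an optimized routine (e.g. FFT-based correlation) rather than this double loop, but the score definition is the same.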

Embodiment 2

[0048] Referring to Figures 1 to 4, the indoor passive navigation and positioning method uses the above system for positioning. The specific operation steps are as follows:

[0049] Step 1: The robot moves indoors and uses the Kinect (1) to collect image data and depth data inside the building, transmitting the data to the industrial computer (3);
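Step 1 pairs each image pixel with a depth reading. To turn a detected identifier pixel into a coordinate relative to the robot (Kinect), a pinhole back-projection is typically applied. The sketch below assumes illustrative Kinect-class intrinsic parameters; the actual calibration values (FX, FY, CX, CY) are not given in the patent.

```python
import numpy as np

# Assumed pinhole intrinsics for an original-Kinect-class depth camera
# at 640x480 resolution; these numbers are assumptions, not from the patent.
FX, FY = 525.0, 525.0      # focal lengths in pixels
CX, CY = 319.5, 239.5      # principal point

def pixel_to_camera(u, v, depth_m):
    """Back-project pixel (u, v) with depth `depth_m` (metres) to a 3-D
    point in the camera frame (X right, Y down, Z forward)."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return np.array([x, y, depth_m])
```

A pixel at the principal point maps straight ahead of the camera, at the measured depth.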



Abstract

The invention discloses an indoor passive navigation and positioning system and an indoor passive navigation and positioning method. The system comprises a depth camera (Kinect), an inertial navigator and an identifier. The Kinect and the inertial navigator are connected to an industrial personal computer (SBC84823); the identifier is affixed to a wall. The Kinect obtains image data and depth data of the identifier; after the inertial navigator determines the robot's attitude angles, the depth data and the image data are transmitted to the SBC84823 through a USB interface; and the SBC84823 calculates the coordinate position of the robot (Kinect). The method comprises the following steps: first, the world coordinate of the identifier in the image is recognized by a digital identification technique, and the identifier's coordinate relative to the robot (Kinect) is calculated; then the robot's attitude angles, determined by the inertial navigator, and the robot's unknown coordinate form a transformation matrix, yielding a transformation equation; finally, the robot's own coordinate is solved from this equation. The invention is mainly used for detecting and calculating the coordinate position of the robot.
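The transformation equation described in the abstract can be sketched as follows. Assuming a Z-Y-X (yaw-pitch-roll) Euler convention for the attitude angles (the patent does not fix the convention), and given the identifier's known world coordinate and its camera-frame coordinate, the robot's unknown position t follows from p_world = R · p_camera + t. All function names here are illustrative.

```python
import numpy as np

def rotation_from_attitude(roll, pitch, yaw):
    """Z-Y-X rotation matrix from attitude angles in radians.
    One common convention; the patent does not specify which is used."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def robot_world_position(marker_world, marker_in_camera, attitude_rad):
    """Solve  p_world = R @ p_camera + t  for the robot/camera position t,
    given the identifier's world coordinate, its camera-frame coordinate,
    and the inertial attitude angles (roll, pitch, yaw)."""
    R = rotation_from_attitude(*attitude_rad)
    return marker_world - R @ marker_in_camera
```

With zero attitude the robot position is simply the marker's world coordinate minus its camera-frame offset; a nonzero yaw rotates that offset before subtraction.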

Description

technical field

[0001] The invention discloses an indoor passive navigation and positioning system and method, and relates to the fields of depth-image data, image line recognition, image processing and robot engineering.

Background technique

[0002] When geological disasters occur, such as earthquakes or radioactive-material leaks, the indoor environment may become unsafe, and manual rescue operations can be very dangerous. In such situations, using robots for survey and rescue will become an irreplaceable trend. However, the indoor environment prevents many common robot rescue measures from being implemented, most importantly mobile-robot navigation. GPS signals cannot be received indoors, and obstacles such as walls also hinder the transmission of various wireless signals. Indoor navigation is itself one of the difficulties of modern robotics research, especially in environments where rad…

Claims


Application Information

IPC(8): G01C21/00; G01C21/16
CPC: G01C21/00; G01C21/16
Inventors: 蒲华燕, 张娟, 顾建军, 罗均, 谢少荣, 马捷, 颜春明, 瞿栋
Owner: SHANGHAI UNIV