
A Fast Robot Vision Positioning Method Based on Dead Reckoning

A robot vision positioning technology, applied in the field of visual positioning for mobile robots. It addresses problems such as the low propagation speed of ultrasound, positioning accuracy that is insufficient indoors, and other factors that degrade positioning accuracy, with the effect of shortening image-processing time while preserving positioning accuracy.

Active Publication Date: 2022-03-29
HANGZHOU DIANZI UNIV
Cites: 4 · Cited by: 0


Problems solved by technology

[0003] For outdoor positioning, the global positioning system (GPS) can basically meet the need, but indoors, owing to many restrictions, its accuracy is insufficient for positioning.
Inertial navigation uses inertial sensors such as gyroscopes, accelerometers, and electromagnetic compasses to position and navigate a mobile robot. However, accelerometers and gyroscopes are prone to drift, and an electromagnetic compass depends on the ambient magnetic field; these problems degrade positioning accuracy.
Indoor positioning can also be realized with wireless technologies such as infrared, ultrasound, and radio-frequency identification. Ultrasound propagates slowly, measures accurately, and is insensitive to external light and magnetic fields, so it is widely used, but it is susceptible to multipath effects, non-line-of-sight propagation, and temperature changes.
Infrared is low-cost and structurally simple, but external interference easily degrades its positioning accuracy.
Moreover, these approaches require a large amount of auxiliary hardware, which complicates maintenance.



Examples


Specific Embodiments

[0035] The specific implementation of the whole process of the present invention is illustrated below by way of example:

[0036] Step 1. Mount the camera on the R-shaped bracket and fix it at a set height, ensuring that the camera lens is level with the positioning area; then connect the power supply and network cable, start shooting, and use the host-computer software on a desktop PC to acquire the original color image of the positioning area, as shown in Figure 1.

[0037] Step 2. Adjust the lens aperture and focus, then keep them fixed. Place the calibration board in the positioning area, continuously vary its pose in front of the lens, and capture multiple images to obtain and save calibration-board images in different poses.

[0038] Step 3. After acquisition, process the calibration-board images in MATLAB to perform camera calibration and obtain the camera parameters.
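The calibration parameters from Step 3 are what later allow the image coordinates of the detected robot to be converted into actual floor coordinates. As a hedged illustration (not the patent's own code), the sketch below shows one standard way to do this with a planar homography H = K·[r1 r2 t] built for the floor plane Z = 0; the intrinsic matrix K, pose [R | t], and pixel values are all invented for the example.

```python
import numpy as np

def pixel_to_world(H, u, v):
    """Map an image pixel (u, v) to floor-plane coordinates (X, Y) in metres."""
    q = np.linalg.inv(H) @ np.array([u, v, 1.0])
    return q[0] / q[2], q[1] / q[2]

# Illustrative intrinsics and pose (camera looking straight down at the floor).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                        # r1, r2 = first two columns of R
t = np.array([0.0, 0.0, 2.0])        # camera 2 m above the floor plane
H = K @ np.column_stack((R[:, 0], R[:, 1], t))

X, Y = pixel_to_world(H, 480.0, 400.0)   # pixel of the detected robot
# → (0.4, 0.4) metres on the floor for these invented numbers
```

The same homography also works in the forward direction (floor point to pixel), which is a convenient consistency check after calibration.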

[0039] Step 4. Under normal circ...



Abstract

The invention discloses a fast robot visual positioning method based on dead reckoning. When processing image features, the method adopts grid processing to obtain the position coordinates of the target robot in the image, then converts the image coordinates into actual coordinates through a coordinate transformation. Finally, for a mobile robot in motion, data are transmitted through a wireless communication module, and dynamic positioning is completed using the robot's angular velocity and heading angle. The invention reduces the amount of image data to be processed and, while ensuring positioning accuracy, shortens the image-processing time required for target recognition; the processing scheme in the motion state likewise plays an important role in shortening image-processing time.
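The dynamic-positioning step described above propagates the robot's pose from its angular velocity and heading angle between vision fixes. The sketch below is an illustrative assumption about such a dead-reckoning update (a simple Euler integration of a unicycle model), not the patent's exact formulation; the speed v, gyro rate omega, and time step dt are hypothetical inputs.

```python
import math

def dead_reckon(x, y, theta, v, omega, dt):
    """Advance pose (x, y, theta) by one time step of length dt.

    v     -- linear speed reported by the robot (m/s)
    omega -- angular velocity from the gyro (rad/s)
    """
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta

# Drive straight along the x-axis for 1 s at 0.5 m/s (ten 0.1 s steps).
pose = (0.0, 0.0, 0.0)
for _ in range(10):
    pose = dead_reckon(*pose, v=0.5, omega=0.0, dt=0.1)
```

In practice the vision fix would periodically overwrite the integrated pose, since dead reckoning alone accumulates drift.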

Description

Technical field

[0001] The invention belongs to the field of mobile-robot positioning and in particular relates to a visual positioning method for a mobile robot.

Background technique

[0002] With the development of science and technology and the progress of society, robots, as a strategic emerging industry, have been widely used in military, civil, and other applications. The demand for robots has grown greatly, giving them a large market space and promising prospects. Enabling robots to accomplish given tasks quickly and efficiently in unknown environments has become a major challenge in robot research and development. At the same time, for a typical mobile robot, whether performing feedback control or effectively avoiding static and dynamic obstacles, its current position must be known accurately; this is the positioning problem of a mobile robot.

[0003] The global positioning system (GPS) can basi...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/80, G06T7/70, G06T7/207, G01C21/00, G01C11/04
CPC: G06T7/80, G06T7/70, G06T7/207, G01C21/005, G01C11/04, G06T2207/10004, G06T2207/20021, G06T2207/30244
Inventors: 柏建军, 耿新, 尚文武, 邹洪波, 陈云
Owner: HANGZHOU DIANZI UNIV