Autonomous positioning method for robot

An autonomous positioning method in the field of robotics, applied to instruments, image analysis, image enhancement, and related areas. It addresses the problem of robot positioning accuracy being degraded by inaccurate image alignment, and achieves improved image-alignment accuracy, stronger constraint relationships, and higher positioning precision.

Active Publication Date: 2021-06-01
CHONGQING UNIV

Problems solved by technology

[0005] In view of this, to solve the problems described above, the purpose of the present invention is to provide a robot autonomous positioning method that addresses the technical problem, present in existing robot autonomous positioning technology, of positioning accuracy being degraded by inaccurate image alignment.


Image

(drawings: three figures, each titled "Autonomous positioning method for robot")


Embodiment Construction

[0037] The present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0038] In the present embodiment, the autonomous positioning method of the robot comprises the following steps:

[0039] 1) The robot collects images of the current environment through the camera.

[0040] 2) Convert the current frame image collected by the camera, and the reference image selected as the positioning reference, into the HSI color space to obtain the three components H, S, and I.
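The RGB-to-HSI conversion in step 2) is not spelled out in this excerpt; below is a minimal sketch using the standard HSI formulas (the function name and the assumption of normalized RGB inputs in [0, 1] are mine, not from the patent):

```python
import numpy as np

def rgb_to_hsi(r, g, b, eps=1e-8):
    """Convert one normalized RGB pixel in [0, 1] to (H, S, I), H in radians."""
    i = (r + g + b) / 3.0                          # intensity
    s = 1.0 - 3.0 * min(r, g, b) / (r + g + b + eps)  # saturation
    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b)) + eps
    h = float(np.arccos(np.clip(num / den, -1.0, 1.0)))  # hue angle
    if b > g:                                      # reflect into [0, 2*pi)
        h = 2.0 * np.pi - h
    return h, s, i
```

For example, a pure red pixel (1, 0, 0) maps to hue ≈ 0, saturation 1, and intensity 1/3.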

[0041] 3) Extract from the reference image the projected point p1j, in the reference image, of a point Pj in real environment space:

[0042] (formula (1); the equation image is not reproduced in the source)

[0043] where … is the grayscale value of the pixel, … are its color components, … is the number of rows of the projected point in the image array, … is the number of columns of the projected point in the image array, and … are the image coordinates of the projected point.

[0044] In the above formula (1), K is the in...
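Formula (1) itself is not legible in this copy, but the intrinsic matrix K mentioned in [0044] implies the standard pinhole projection. The sketch below is a hedged illustration of that model, not the patent's exact formula; the numbers in K and P are purely illustrative:

```python
import numpy as np

def project(K, P_cam):
    """Pinhole projection: map a 3-D point in the camera frame to pixel (u, v)."""
    p = K @ P_cam               # homogeneous image coordinates
    return p[0] / p[2], p[1] / p[2]

# illustrative intrinsics: focal lengths 500 px, principal point (320, 240)
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])
P = np.array([0.2, -0.1, 2.0])  # a point 2 m in front of the camera
u, v = project(K, P)            # u = 370.0, v = 215.0
```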



Abstract

The autonomous positioning method for a robot comprises the following steps: 1) the robot acquires an image of the current environment through a camera; 2) the current frame image acquired by the camera and the reference image selected as the positioning reference are converted into the HSI color space; 3) the projected point of a point Pj in real environment space is extracted from the reference image; 4) the projected point of the point Pj in the current frame image is calculated; 5) the projection error rj between the projected points is calculated; and 6) the optimal solution of the camera pose is obtained by iteratively minimizing the objective function E(ξ). By exploiting more image information and stronger constraint relationships through a higher-precision image registration result, the method improves the precision of robot pose estimation, that is, the precision of the robot's autonomous positioning.
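Steps 5) and 6) of the abstract can be illustrated by the sum of squared multi-channel residuals that a direct method over the H, S, I channels would minimize. The function names and the channel-first array layout below are my assumptions, and the actual iterative optimization over the pose ξ (e.g. Gauss-Newton) is omitted:

```python
import numpy as np

def residual(ref_hsi, cur_hsi, p_ref, p_cur):
    """Per-point residual r_j over the H, S, I channels (a 3-vector).

    ref_hsi, cur_hsi: arrays of shape (3, height, width).
    p_ref, p_cur: (u, v) pixel coordinates of the matched projected points.
    """
    (u1, v1), (u2, v2) = p_ref, p_cur
    return ref_hsi[:, v1, u1] - cur_hsi[:, v2, u2]

def objective(ref_hsi, cur_hsi, matches):
    """E = sum over j of ||r_j||^2, the quantity minimized w.r.t. the pose."""
    return sum(float(r @ r)
               for r in (residual(ref_hsi, cur_hsi, a, b) for a, b in matches))
```

With toy images that differ by 1 in every channel, each matched pair contributes 1 + 1 + 1 = 3 to E.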

Description

Technical field

[0001] The invention relates to the fields of computer vision and robotics, and in particular to a method for autonomous positioning of a robot.

Background technique

[0002] In existing robot autonomous positioning methods, the direct method of visual SLAM relies on the assumption of photometric invariance: the input image is converted directly into a grayscale image, and the camera motion and point projections are estimated simultaneously from the pixel grayscale information. However, the human eye is more sensitive to color than to grayscale, and in a real camera imaging system the grayscale assumption is violated by the camera's automatic exposure and by specular reflection from object surfaces. Therefore, using grayscale information alone may cause image alignment to fail.

[0003] Simply put, image alignment, which aims to find the best image transfo...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/73; G06T7/77; G06T7/33
CPC: G06T7/74; G06T7/77; G06T7/337; G06T2207/20076
Inventors: 薛方正, 刘世敏, 岑汝平, 苏晓杰, 江涛
Owner CHONGQING UNIV