
An improved method of PTAM based on ground features of intelligent robots

A technology of intelligent robots, applied in the field of robot vision, that can solve the problems of strict camera-movement restrictions and the inability to establish metric maps

Inactive Publication Date: 2017-09-01
BEIJING UNIV OF TECH

AI Technical Summary

Problems solved by technology

However, the unimproved PTAM algorithm cannot establish a metric map and imposes strict restrictions on the movement of the camera.




Embodiment Construction

[0091] The present invention is described in further detail below in conjunction with the accompanying drawings.

[0092] The flow chart of the improved PTAM algorithm based on ground features is shown in Figure 1, and the method specifically includes the following steps:

[0093] Step 1, parameter correction

[0094] Step 1.1, parameter definition

[0095] The robot pose representation is constructed from the relationship between the robot coordinate system and the world coordinate system, and the ground plane calibration parameters are determined by the pose relationship between the camera and the target plane.
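As a concrete illustration of these conventions, the sketch below represents a pose as a 4x4 homogeneous transform between two coordinate systems. This is not taken from the patent; the function and variable names (make_pose, world_T_robot, and so on) are illustrative assumptions.

```python
# A minimal sketch (my assumption, not the patent's code) of the pose
# conventions described above: a robot pose is a rigid transform between the
# robot coordinate system and the world coordinate system, and the ground-plane
# calibration is likewise a transform between the camera and the target plane.
import numpy as np

def make_pose(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def transform_point(T: np.ndarray, p: np.ndarray) -> np.ndarray:
    """Apply a 4x4 homogeneous transform to a 3D point."""
    ph = np.append(p, 1.0)
    return (T @ ph)[:3]

# Example: a robot 2 m along the world x-axis, rotated 90 degrees about z.
c, s = np.cos(np.pi / 2), np.sin(np.pi / 2)
Rz = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
world_T_robot = make_pose(Rz, np.array([2.0, 0.0, 0.0]))

# A point expressed in the robot frame, mapped into the world frame.
p_robot = np.array([1.0, 0.0, 0.0])
print(transform_point(world_T_robot, p_robot))  # approx. [2., 1., 0.]
```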

[0096] Step 1.2, camera correction

[0097] The FOV model is used to correct the monocular camera: image pixel coordinates are mapped onto the normalized coordinate plane, and image distortion correction is achieved in combination with the camera intrinsic parameter matrix K.
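The patent names the FOV model but does not reproduce its equations here. Below is a minimal sketch, assuming the standard single-parameter FOV (Devernay-Faugeras) model conventionally used with PTAM, of mapping a pixel through the intrinsic matrix K and undoing the radial distortion; the function undistort_fov and the example intrinsics are my assumptions.

```python
# A hedged sketch of FOV-model undistortion: pixel -> normalized plane via K,
# then inversion of the radial distortion r_d = arctan(2*r_u*tan(w/2)) / w,
# where w (omega) is the single FOV distortion parameter.
import numpy as np

def undistort_fov(u, v, fx, fy, cx, cy, omega):
    """Map pixel (u, v) to undistorted normalized coordinates under the FOV model."""
    # Pixel -> distorted normalized coordinates using the intrinsics in K.
    xd = (u - cx) / fx
    yd = (v - cy) / fy
    rd = np.hypot(xd, yd)          # distorted radius
    if rd < 1e-8:
        return xd, yd              # at the principal point, nothing to undo
    # Inverse of the FOV distortion: r_u = tan(r_d * omega) / (2 * tan(omega / 2)).
    ru = np.tan(rd * omega) / (2.0 * np.tan(omega / 2.0))
    scale = ru / rd
    return xd * scale, yd * scale

# Example with made-up intrinsics for a 640x480 camera.
print(undistort_fov(400.0, 300.0, fx=500.0, fy=500.0,
                    cx=320.0, cy=240.0, omega=0.9))
```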

[0098] Step 2, initialization based on ground features

[0099] Step 2.1, ...
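The text of Step 2.1 is truncated in the source, so the following is only a hedged sketch of the kind of computation a two-key-frame, ground-feature initialization typically relies on: estimating the homography induced by the ground plane from matched corner features (plain DLT, without normalization or RANSAC). This is an assumption about the standard technique, not the patent's own algorithm.

```python
# Hedged sketch: direct linear transform (DLT) estimate of the homography H
# induced by the ground plane between two key frames, from N >= 4 matched
# corner features. All names here are illustrative assumptions.
import numpy as np

def homography_dlt(pts1: np.ndarray, pts2: np.ndarray) -> np.ndarray:
    """Estimate H such that pts2 ~ H @ pts1, from point pairs of shape (N, 2)."""
    rows = []
    for (x, y), (xp, yp) in zip(pts1, pts2):
        rows.append([-x, -y, -1, 0, 0, 0, x * xp, y * xp, xp])
        rows.append([0, 0, 0, -x, -y, -1, x * yp, y * yp, yp])
    A = np.asarray(rows, dtype=float)
    # The null vector of A (last right singular vector) holds H up to scale.
    _, _, Vt = np.linalg.svd(A)
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

# Quick self-check against a known homography.
H_true = np.array([[1.0, 0.1, 5.0], [0.0, 1.2, -3.0], [1e-3, 0.0, 1.0]])
p1 = np.array([[0, 0], [100, 0], [100, 80], [0, 80], [50, 40]], dtype=float)
p1h = np.hstack([p1, np.ones((5, 1))])
p2h = (H_true @ p1h.T).T
p2 = p2h[:, :2] / p2h[:, 2:]
print(np.allclose(homography_dlt(p1, p2), H_true))  # True
```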



Abstract

An improved PTAM method based on the ground features of intelligent robots. First, parameter correction is completed, comprising parameter definition and camera correction. Data association between corner features is then established from the feature information in the current image to obtain a pose estimation model. In the initial map-building stage, the camera is mounted on the mobile robot and two key frames are acquired; during initialization, the mobile robot begins to move while the camera captures corner-point information in the current scene and establishes associations. After the 3D sparse map is initialized, the key frames are updated, and the epipolar search and block matching method is used to establish a sub-pixel-precision mapping relationship between feature points, which is combined with the pose estimation model to achieve precise camera relocalization. Finally, the matched points are projected into space to complete the creation of the 3D map of the current global environment.
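The abstract names "epipolar search and block matching" with sub-pixel precision but gives no formulas. The following is a hedged sketch of one common realization: zero-mean SSD block matching over points sampled along an epipolar line, refined with a parabolic sub-pixel fit. The function names and the choice of ZSSD are my assumptions, not the patent's specification.

```python
# Hedged sketch: score candidate pixels along an epipolar line with zero-mean
# SSD against a reference patch, then refine the best match to sub-pixel
# precision with a 1-D parabola fit over the neighboring scores.
import numpy as np

def zssd(patch_a: np.ndarray, patch_b: np.ndarray) -> float:
    """Zero-mean sum of squared differences between two equal-size patches."""
    a = patch_a - patch_a.mean()
    b = patch_b - patch_b.mean()
    return float(np.sum((a - b) ** 2))

def match_along_line(ref_patch, image, line_pts, half=4):
    """Find the best match for ref_patch among integer samples on an epipolar line.

    line_pts: (N, 2) integer (x, y) samples, all at least `half` pixels from
    the image border. Returns (best_index, subpixel_offset_along_line).
    """
    scores = []
    for x, y in line_pts:
        cand = image[y - half:y + half + 1, x - half:x + half + 1]
        scores.append(zssd(ref_patch, cand))
    scores = np.asarray(scores)
    i = int(np.argmin(scores))
    # Parabola fit over scores (i-1, i, i+1) gives an offset in [-0.5, 0.5].
    if 0 < i < len(scores) - 1:
        s0, s1, s2 = scores[i - 1], scores[i], scores[i + 1]
        denom = s0 - 2 * s1 + s2
        offset = 0.5 * (s0 - s2) / denom if denom > 1e-12 else 0.0
    else:
        offset = 0.0
    return i, offset
```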

Description

Technical Field

[0001] The invention belongs to the field of robot vision and relates to an improved PTAM algorithm based on ground features.

Background Technique

[0002] As the relationship between robots and humans grows ever closer, the technologies related to intelligent robots have received great attention. Simultaneous Localization and Mapping (SLAM) is one of the most mainstream positioning technologies for intelligent mobile robots. It is in essence a motion estimation problem: the internal and external data acquired by the robot's sensors are used to compute the position of the mobile robot at a given moment while simultaneously building the map model on which it depends. Vision-based SLAM falls within the research area of visual measurement, because the visual sensor has unique advantages: small size, light weight, low price, easy installation, and very rich extracted external information. These advantages further promote the current resear...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/80, G01C21/00
Inventors: 贾松敏, 王可, 宣璇, 张鹏, 董政胤
Owner: BEIJING UNIV OF TECH