A Simultaneous Visual Servo and Adaptive Depth Recognition Method for Mobile Robots

The invention relates to mobile robot visual servoing technology, applied to two-dimensional position/course control and related fields. It identifies depth information without adding a distance sensor, thereby avoiding increased system complexity and cost, solves the global stability problem of the system, and achieves a good perception effect.

Inactive Publication Date: 2019-09-17
TIANJIN POLYTECHNIC UNIV

AI Technical Summary

Problems solved by technology

Therefore, the method solves the problem of identifying depth information in existing mobile robot visual servoing without the need to add a distance sensor, and thus without increasing the complexity and cost of the system.

Examples

Embodiment 1

[0073] First, define the system coordinate system

[0074] Section 1.1, System Coordinate System Description

[0075] The coordinate system of the on-board camera is defined to be consistent with the coordinate system of the mobile robot. A Cartesian coordinate frame is attached to the desired pose of the robot/camera (the desired frame), with its origin at the center of the wheel axis, which is also the optical center of the camera. The z* axis coincides with the optical axis of the camera lens and with the forward direction of the robot; the x* axis is parallel to the wheel axis of the robot; and the y* axis is perpendicular to the x*-z* plane (the motion plane of the mobile robot). A second frame, defined in the same way and attached to the current pose of the camera/robot, serves as the current pose coordinate system.
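As an aside (not part of the patent text), the relation between the desired frame and the current frame in this planar setting can be summarized by a single planar rigid-body transform: a translation in the x-z motion plane plus a rotation about the vertical y axis. The Python sketch below assumes that parameterization; the function name planar_pose, the frame labels F and F*, and the sample numbers are invented here for illustration.

import numpy as np

def planar_pose(x: float, z: float, theta: float) -> np.ndarray:
    # Homogeneous transform of the current frame (here labelled F) expressed
    # in the desired frame (F*): translation (x, z) in the motion plane and
    # rotation theta about the vertical y axis.
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, z],
                     [0.0, 0.0, 1.0]])

# Hypothetical example: current pose offset laterally by 1 m, 2 m short of the
# goal, and rotated by 30 degrees; the visual servo controller must drive this
# transform to the identity.
T = planar_pose(1.0, -2.0, np.deg2rad(30.0))
print(T)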

[0076] Let e(t) denote the distance between the desired position and the current position; let θ(t) denote the rotation angle of the robot's current frame with respect to the desired frame; and let α(t) denote the angle between the forward direction of the robot in its current pose and the line connecting the current position to the desired position.
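For context, one common open-loop kinematic model for polar error coordinates of this kind is sketched below in Python. It assumes θ(t) is the heading error, α(t) is the bearing of the desired position seen from the current robot frame, and the control inputs are the linear velocity v and angular velocity omega; the exact equations and sign conventions derived in the patent may differ, so this is an illustrative sketch only.

import math

def polar_error_kinematics(e, alpha, theta, v, omega):
    # Illustrative open-loop kinematics for the polar error coordinates
    # (a common textbook form, not necessarily the patent's derivation):
    #   de/dt     = -v*cos(alpha)            distance shrinks along the line of sight
    #   dalpha/dt =  v*sin(alpha)/e - omega  bearing change minus robot rotation
    #   dtheta/dt =  omega                   heading error changes with angular velocity
    e_dot = -v * math.cos(alpha)
    alpha_dot = v * math.sin(alpha) / e - omega
    theta_dot = omega
    return e_dot, alpha_dot, theta_dot

# Example: one explicit-Euler integration step from a hypothetical error state.
e, alpha, theta = 2.0, 0.3, -0.2
v, omega, dt = 0.5, 0.1, 0.01
de, dalpha, dtheta = polar_error_kinematics(e, alpha, theta, v, omega)
e, alpha, theta = e + dt * de, alpha + dt * dalpha, theta + dt * dtheta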

[0077] arrive T...

Abstract

A method for simultaneously performing visual servoing and adaptive depth identification with a mobile robot belongs to the technical field of computer vision and mobile robots. The method comprises: obtaining an open-loop kinematic equation with a stable error form according to a polar-coordinate representation of the robot pose; designing an adaptive update law capable of identifying the depth information according to a concurrent learning strategy; and constructing a visual stabilization control law for the mobile robot. The parameter adaptive update law designed by the invention learns during the initial stage of the robot's stabilization motion and identifies the depth information online during the robot's movement. The simultaneous convergence of the pose errors and the depth identification error is proven by means of the Lyapunov method and the LaSalle invariance principle. The method can accurately and reliably identify the depth information while the mobile robot completes the visual stabilization control, thereby achieving simultaneous convergence of the controller and the identification module.
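To make the concurrent-learning idea concrete, the following Python sketch shows a generic scalar concurrent-learning estimator for an unknown constant parameter d (for example an unknown depth or inverse depth) with linearly parameterized measurements y_j = phi_j * d. It is not the patent's specific update law; the class name, the gains gamma and k_cl, and the measurement model are assumptions introduced for illustration. The memory term built from recorded data keeps driving the estimate even after the instantaneous signal loses excitation, which is what allows identification without a persistent-excitation condition.

from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class ConcurrentLearningEstimator:
    gamma: float = 1.0       # adaptation gain
    k_cl: float = 0.5        # weight on the recorded-data (memory) term
    d_hat: float = 0.0       # current parameter estimate
    history: List[Tuple[float, float]] = field(default_factory=list)

    def record(self, phi: float, y: float) -> None:
        # Store an informative (regressor, measurement) pair, e.g. gathered
        # during the initial stage of the stabilization motion.
        self.history.append((phi, y))

    def step(self, phi: float, y: float, dt: float) -> float:
        # One Euler step of the concurrent-learning gradient update:
        # d_hat_dot = gamma * (phi*eps + k_cl * sum_j phi_j*eps_j),
        # where eps = y - phi*d_hat is the prediction error.
        instantaneous = phi * (y - phi * self.d_hat)
        memory = sum(p * (m - p * self.d_hat) for p, m in self.history)
        self.d_hat += dt * self.gamma * (instantaneous + self.k_cl * memory)
        return self.d_hat

A typical usage pattern would be to call record() on a handful of informative samples early in the motion and then call step() at every control cycle alongside the visual stabilization controller.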

Description

Technical Field
[0001] The invention belongs to the technical field of computer vision and mobile robots, and in particular relates to a simultaneous visual servoing and adaptive depth identification method for a mobile robot.

Background Technology
[0002] For mobile robot systems, the introduction of visual sensors can greatly enhance their intelligence, flexibility and environmental perception [1-3] (see references [1-3] in the appendix; subsequent citations likewise refer to the appendix). Controlling the motion of a mobile robot through real-time image feedback, that is, visual servoing technology, can be widely applied in various fields such as intelligent transportation and environmental exploration. For these reasons, this technology has received particular attention and has become a research hotspot in the field of robotics. Since vision sensors form images according to a perspective projection model, the lack of depth information is their main defect. Therefore, for the monoc...
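As background for why a monocular image loses depth, the short Python sketch below uses the standard pinhole (perspective projection) model; the focal length and principal point values are placeholders, not parameters from the patent. Because the pixel coordinates depend only on the ratios x/z and y/z, scaling the whole point by any factor leaves the image unchanged, so the depth z cannot be recovered from a single image.

def project(x: float, y: float, z: float,
            f: float = 500.0, cx: float = 320.0, cy: float = 240.0):
    # Pinhole projection with an assumed focal length f and principal
    # point (cx, cy).
    return f * x / z + cx, f * y / z + cy

p1 = project(0.2, 0.1, 2.0)
p2 = project(0.4, 0.2, 4.0)   # the same point scaled by k = 2
print(p1, p2)                 # identical pixel coordinates: depth is lost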

Application Information

Patent Timeline
no application
Patent Type & Authority: Patent (China)
IPC (8): G05D1/02
Inventors: 李宝全, 师五喜, 邱雨, 郭利进, 陈奕梅
Owner: TIANJIN POLYTECHNIC UNIV