
Walking robot self-adaptation calibration method, system and facility and storage medium

A walking-robot calibration technology, applied to manipulators, program-controlled manipulators, manufacturing tools, etc. It addresses the problems of inconvenient use, lengthened vision-experiment cycles, and time-consuming hand-eye calibration, with the effect of improving calibration accuracy and motion accuracy.

Active Publication Date: 2019-04-26
NEXTVPU SHANGHAI CO LTD

AI Technical Summary

Problems solved by technology

However, the general hand-eye calibration process is complicated and requires manual intervention, such as taking marker points or manually recording data, which makes it inconvenient to use. Moreover, hand-eye calibration is needed frequently in vision experiments: whenever the position of the camera, the position of the robot arm, or the type of the robot arm changes, hand-eye calibration must be performed again. Since traditional hand-eye calibration takes a long time, this greatly lengthens the cycle of a vision experiment.

Method used



Examples


Embodiment Construction

[0070] Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The same reference numerals denote the same or similar structures in the drawings, and thus their repeated descriptions will be omitted.

[0071] Figure 1 is a flowchart of the self-adaptive calibration method of the walking robot of the present invention. As shown in Figure 1, the self-adaptive calibration method of the walking robot of the present invention comprises the following steps:

[0072] S101, the data acquisition step: assuming that the walking robot moves in a plane coordinate system based on the ground, collect the actual linear vel...
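The positioning-iteration step described above can be sketched as a weighted differential-drive odometry update. The function name, the coefficient names `a1`/`a2`/`a3`, and the exact update rule below are illustrative assumptions for a standard unicycle model, not the patent's disclosed formulas:

```python
import math

def odometry_step(x, y, theta, v_l, v_r, omega, dt, a1=1.0, a2=1.0, a3=1.0):
    """One positioning-iteration step for a differential-drive walking robot.

    a1, a2, a3 are the first, second, and third weight coefficients applied
    to the left-wheel linear speed, right-wheel linear speed, and angular
    speed respectively (illustrative model, not the patent's exact method).
    """
    v = (a1 * v_l + a2 * v_r) / 2.0  # weighted forward linear speed
    w = a3 * omega                   # weighted angular speed
    # Integrate pose over dt using the heading at the start of the step.
    x_new = x + v * math.cos(theta) * dt
    y_new = y + v * math.sin(theta) * dt
    theta_new = theta + w * dt
    return x_new, y_new, theta_new
```

With all three coefficients at 1.0 this reduces to plain odometry; the calibration step adjusts the coefficients so the iterated pose tracks the vision-measured pose.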



Abstract

The invention provides a walking robot self-adaptation calibration method, system, device, and storage medium. The method comprises the following steps: a first weight coefficient related to the left-wheel linear speed of a walking robot, a second weight coefficient related to the right-wheel linear speed, and a third weight coefficient related to the angular speed are set; current theoretical coordinates are obtained through positioning iteration, and current actual coordinates are obtained through video positioning; and according to the deflection between the current theoretical coordinates and the current actual coordinates, at least one set of first, second, and third weight coefficients is solved, so that calibrated theoretical coordinates are obtained. With this method, system, device, and storage medium, self-adaptive calibration can be performed for different application scenarios according to scene changes and changes in the robot's motion characteristics, so that the robot achieves good motion precision in various scenarios, and the accuracy and flexibility of calibration across different scenarios are greatly improved.
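The coefficient-solving step in the abstract can be illustrated as a least-squares fit: given pairs of commanded motion and the vision-measured displacement, solve for the weights that make the odometry model match the measurements. The sample layout and the linear formulation below are assumptions for illustration, not the patent's exact solving procedure:

```python
import numpy as np

def solve_weights(samples):
    """Fit weight coefficients (a1, a2, a3) so a weighted differential-drive
    odometry model best reproduces vision-measured motion.

    Each sample is (v_l, v_r, omega, dt, dx_body, dtheta): commanded wheel
    linear speeds and angular speed over interval dt, plus the actual forward
    displacement and heading change measured by video positioning.
    """
    A, b = [], []
    for v_l, v_r, omega, dt, dx_body, dtheta in samples:
        # Forward displacement model: (a1*v_l + a2*v_r)/2 * dt = dx_body
        A.append([0.5 * v_l * dt, 0.5 * v_r * dt, 0.0])
        b.append(dx_body)
        # Heading change model: a3 * omega * dt = dtheta
        A.append([0.0, 0.0, omega * dt])
        b.append(dtheta)
    coeffs, *_ = np.linalg.lstsq(np.asarray(A), np.asarray(b), rcond=None)
    return coeffs  # array of (a1, a2, a3)
```

In this sketch, at least two samples with distinct left/right wheel-speed ratios are needed to determine a1 and a2 separately; re-running the fit whenever the scene or the robot's motion characteristics change gives the self-adaptive behavior the abstract describes.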

Description

Technical Field

[0001] The invention relates to the field of robot calibration, and in particular to an adaptive calibration method, system, device, and storage medium for a walking robot.

Background

[0002] With the development of computer technology, computer vision, as an important research field of artificial intelligence, has been widely applied across industries. Combining computer vision with robotics has also enabled the field of intelligent robotics to develop vigorously. For manipulator grasping, the manual teaching method is traditionally used: the manipulator is moved by hand so that it goes to a fixed position for grasping. This method is relatively inefficient, and because the manipulator has no perception of its surroundings, if the position of the robotic arm or of the object changes, the arm can no longer grasp the object.

[0003] Applying computer vision to the field of robotics ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): B25J9/16
CPC: B25J9/161; B25J9/1653
Inventors: 张华, 周骥, 冯歆鹏
Owner: NEXTVPU SHANGHAI CO LTD