A state estimation method and system for multi-modal perception of a legged robot

A state estimation technology in the field of robotics, addressing the problems of inaccurate numerical computation of the dynamic model and of reduced accuracy and stability of state estimation over a state space that includes not only the body posture but also the positions and postures of the foot joints.

Active Publication Date: 2021-09-14
INST OF INTELLIGENT MFG GUANGDONG ACAD OF SCI

AI Technical Summary

Problems solved by technology

The traditional state estimation approach combines the robot's dynamic model with a measurement model: the dynamic model supplies the expected state, and the various sensors mounted on the robot provide one or more measured state quantities. This approach faces several problems: 1) the dynamics are highly nonlinear, so numerical evaluation of the dynamic model is inaccurate; 2) the state space of a legged robot is very large, including not only the body posture but also the positions and postures of the foot joints; 3) a legged robot interacts with the environment through multiple intermittent ground contacts and impacts, which makes the sensor measurements noisier.
[0003] It can be seen that traditional algorithms usually rely on fusing data from multiple on-body and external sensors for state estimation. However, external sensors are affected by factors such as light intensity, measurement distance, and sound amplitude, so their measurement reliability cannot be guaranteed, which reduces the accuracy and stability of state estimation.
In addition, because legged robots interact with the surrounding environment through intermittent foot-ground contact, additional noise is introduced into the measurements, and minimizing the impact of this noise is also a challenging problem.




Detailed Description of the Embodiments

[0059] The following will clearly and completely describe the technical solutions in the embodiments of the present invention with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative efforts fall within the protection scope of the present invention.

[0060] Figure 1 shows a schematic flowchart of a state estimation method for multi-modal perception of a legged robot in an embodiment of the present invention. The method includes the following steps:

[0061] S101. Measure the acceleration and angular velocity of the legged robot based on the IMU sensor, and predict the state mean and covariance of the legged robot;

[0062] It should be noted that the IMU sensor includes an accelerometer and a gyroscope,...
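For illustration only, the following is a minimal sketch of the kind of IMU-driven prediction step that S101 describes, assuming a conventional strapdown, error-state formulation in which orientation is propagated from the gyroscope reading and velocity and position from the gravity-compensated accelerometer reading. All names and the 9-dimensional error state below are assumptions made for this sketch, not the patent's actual filter.

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])

def skew(w):
    """Skew-symmetric matrix of a 3-vector (used for small-angle rotation updates)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def imu_predict(p, v, R, P, accel, gyro, Q, dt):
    """Hypothetical EKF prediction step from IMU data (sketch, not the patented filter).

    p, v        : body position and velocity in the world frame
    R           : body orientation as a 3x3 rotation matrix
    P           : covariance of the error state [dp, dv, dtheta] (9x9)
    accel, gyro : accelerometer and gyroscope readings in the body frame
    Q           : process-noise covariance (9x9)
    """
    # Propagate orientation with a first-order (small-angle) rotation update.
    R_next = R @ (np.eye(3) + skew(gyro * dt))
    # Rotate the measured acceleration into the world frame and remove gravity.
    a_world = R @ accel + GRAVITY
    # Integrate velocity and position.
    v_next = v + a_world * dt
    p_next = p + v * dt + 0.5 * a_world * dt**2
    # Linearized state-transition matrix for the error state.
    F = np.eye(9)
    F[0:3, 3:6] = np.eye(3) * dt
    F[3:6, 6:9] = -R @ skew(accel) * dt
    # Covariance prediction.
    P_next = F @ P @ F.T + Q
    return p_next, v_next, R_next, P_next
```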



Abstract

The invention discloses a state estimation method and system for multi-modal perception of a legged robot. The method includes: measuring the acceleration and angular velocity of the legged robot with an IMU sensor and predicting the state mean and covariance of the legged robot; obtaining the angular positions of all joints of the legged robot from the joint encoders and calculating the measured value and predicted value of the foot point position of each foot; calculating the measurement residual of the legged robot from the measured and predicted values; and correcting the state mean and covariance based on the measurement residual and the measurement Jacobian to obtain the final state of the legged robot. The method improves the accuracy and stability of state estimation for legged robots.
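As a rough illustration of the measure-and-correct stage summarized in the abstract, the sketch below forms a foot-position residual from leg kinematics and applies a generic Kalman-style correction using a measurement Jacobian. The forward_kinematics callable and every other name here are hypothetical placeholders rather than the patent's actual formulation.

```python
import numpy as np

def foot_residual(joint_angles, forward_kinematics, predicted_foot_pos):
    """Residual between the foot position measured via leg kinematics (joint
    encoders) and the foot position predicted from the filter state
    (hypothetical helper)."""
    measured = forward_kinematics(joint_angles)   # foot position from encoder angles
    return measured - predicted_foot_pos

def kalman_correct(x, P, residual, H, R_meas):
    """Generic Kalman correction step (assumed form, not taken verbatim from the patent).

    x        : predicted state mean
    P        : predicted state covariance
    residual : measurement residual (measured minus predicted foot positions)
    H        : measurement Jacobian
    R_meas   : measurement-noise covariance
    """
    S = H @ P @ H.T + R_meas               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x + K @ residual               # corrected state mean
    P_new = (np.eye(len(x)) - K @ H) @ P   # corrected covariance
    return x_new, P_new
```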

Description

Technical field
[0001] The invention relates to the field of robots, and in particular to a state estimation method and system for multi-modal perception of a legged robot.
Background technique
[0002] The research and development of legged robots is inspired by mammals and is an important embodiment of bionics and robotics. Legged robots have good environmental adaptability, a wide range of motion, strong load capacity, and a certain ability to operate autonomously, and they have received extensive attention for tasks such as disaster rescue and military reconnaissance. However, the accuracy of the on-body sensors is limited and the measurement data are uncertain, so a good estimate of the robot's state is required in order to execute closed-loop control commands and adapt to unstructured environments. State estimation combines knowledge of the robot's motion and characteristics to fully describe the robot's motion over time in a known environment, and in order to obtain the best estimate with imper...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): B25J9/16, B25J19/02, B62D57/032
CPC: B25J9/1605, B25J9/161, B25J9/1633, B25J19/02, B62D57/032
Inventor: 吴鸿敏, 唐观荣, 苏泽荣, 徐智浩, 鄢武, 周雪峰
Owner: INST OF INTELLIGENT MFG GUANGDONG ACAD OF SCI