
Foot-type robot terrain perception method based on virtual sensor

A virtual-sensor-based terrain perception technology, applied in the field of robot perception, that addresses problems such as degraded robot movement performance, inability to complete predetermined tasks, and reduced system robustness, while providing a dynamically responsive terrain perception effect.

Active Publication Date: 2020-06-09
BEIJING INST OF SPACECRAFT SYST ENG

AI Technical Summary

Problems solved by technology

[0002] Terrain perception is extremely important for outdoor operations, especially for deep space exploration robots. Complex unstructured geological conditions have a great impact on the mobility of a robot, and can prevent it from completing its scheduled tasks or even place it in danger.
Pure vision-based terrain perception methods have difficulty obtaining the terrain structure and physical characteristics beneath the surface soil, while most tactile-based perception methods require tactile sensors to be installed on the robot's feet; these sensors interact with various terrains and are easily damaged, which reduces the robustness of the system.
Moreover, on-orbit failures of spacecraft are difficult to repair, so a tactile sensing system would seriously affect the reliability of the entire system.

Examples

Embodiment 1

[0040] Step (1): Modeling of the terrain perception machine learning model

[0041] The modeling of the terrain perception machine learning model is divided into two main parts: touchdown detection neural network modeling and soil classification machine learning modeling.

[0042] 1) Touchdown detection neural network modeling

[0043] A neural network learning algorithm that takes derivative information into account is used to build a neural network model for touchdown detection of the legged robot. The touchdown detection neural network model contains three layers: an input layer, a hidden layer, and an output layer, with adjacent layers connected by functional mappings. The input signals of the input layer are the joint angle, joint angular velocity, and joint motor current of each joint of a single leg; if the number of leg joints is N, the input layer of the touchdown detection neural network model contains 3N nodes. The output signal ...
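As a minimal sketch of the input/output structure described above (not of the patent's derivative-aware learning algorithm), the following PyTorch model maps the 3N joint signals of one leg to a contact probability; the hidden-layer width, activation functions, and all names are illustrative assumptions.

```python
# Sketch of a touchdown-detection MLP with the 3N-node input layer described
# above (joint angle, angular velocity, motor current per joint). Hidden-layer
# width, activations, and the single-probability output head are assumptions
# for illustration, not specified by the patent text.
import torch
import torch.nn as nn

class TouchdownNet(nn.Module):
    def __init__(self, num_joints: int, hidden: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3 * num_joints, hidden),  # 3N inputs: angle, velocity, current per joint
            nn.Tanh(),
            nn.Linear(hidden, 1),               # single output: probability of ground contact
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

# Example: a leg with 3 joints -> 9 input features per time step.
model = TouchdownNet(num_joints=3)
sample = torch.randn(1, 9)          # [angles, angular velocities, motor currents]
contact_prob = model(sample)
```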

Embodiment 2

[0091] A method for terrain perception of a legged robot using virtual sensors mainly includes four components: (1) sample data collection; (2) terrain perception machine learning model modeling; (3) the machine learning model algorithm; and (4) the virtual sensor terrain perception system. The composition of each part is described in detail below.

[0092] (1) Sample data collection

[0093] In order to verify the effectiveness of this method, a single-leg ground-contact experiment on a hexapod robot is designed. The schematic diagram of the experimental scheme is shown in Figure 5. A one-dimensional force sensor is installed on the sole of one leg of the robot, with a sampling frequency of 1000 Hz. In the experiment, the robot touches the ground with different centroid heights, different walking speeds, and different ground materials. The single walking cycle is taken as 3 s, 4 s, 5 s, 6 s, 7 s, 8 s, 9 s, or 10 s; the centroid height is 0.36 m or 0.42 m; the ground material is al...
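The experiment above yields synchronized joint signals and a sole-force reading at 1000 Hz. The following sketch shows one way such logs could be turned into labelled training samples; the sampling rate and measured signals come from the text, while the column layout, the contact-force threshold, and all names are illustrative assumptions.

```python
# Sketch of assembling training samples from the single-leg experiment logs.
# Assumed log columns: joint angles, joint angular velocities, joint motor
# currents, then the 1-D sole force reading as the last column.
import numpy as np

SAMPLE_RATE_HZ = 1000
CONTACT_FORCE_THRESHOLD_N = 2.0   # assumed threshold for labelling touchdown

def build_samples(log: np.ndarray):
    """Split a log matrix into feature rows and binary contact labels."""
    features = log[:, :-1]                      # joint angles, velocities, currents
    force = log[:, -1]                          # 1-D sole force sensor reading
    contact_label = (force > CONTACT_FORCE_THRESHOLD_N).astype(np.float32)
    return features.astype(np.float32), contact_label

# Example with synthetic data: 3 joints -> 9 feature columns + 1 force column.
fake_log = np.random.rand(5 * SAMPLE_RATE_HZ, 10)  # 5 s of data at 1000 Hz
fake_log[:, -1] *= 10.0                             # synthetic force values in newtons
X, y = build_samples(fake_log)
```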

Abstract

A foot-type robot terrain perception method based on a virtual sensor belongs to the field of robot perception and comprises the following steps: S1, establishing a ground contact detection neural network model and a soil classification machine learning model; S2, under conditions of different terrains and different gaits, collecting the leg joint angles, leg joint angular velocities, motor currents, and leg-ground contact force data of the foot-type robot as samples; S3, training the ground contact detection neural network model and the soil classification machine learning model with the samples collected in S2; and S4, using the ground contact detection neural network model and the soil classification machine learning model trained in S3 as the terrain perception system of the foot-type robot for terrain perception. The method can improve the walking stability and movement capacity of the robot while enhancing its robustness and the reliability of its behavior; in addition, the robot hardware is simplified, reducing design, machining, and maintenance costs.
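A minimal sketch of the S1-S4 flow in the abstract: build the two models (S1), take collected joint angle, angular velocity, motor current, and contact-force samples (S2), train both models (S3), and use the trained pair as the virtual-sensor terrain perception system (S4). The scikit-learn model classes, hyperparameters, placeholder data, and function names are illustrative assumptions, not taken from the patent.

```python
# Sketch of the S1-S4 pipeline from the abstract.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# S1: establish the two models (touchdown detection and soil classification).
touchdown_net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500)
soil_classifier = SVC(kernel="rbf")

# S2: collected samples (placeholder arrays; real data comes from robot logs).
X = np.random.rand(1000, 9)                  # joint angles, velocities, currents
contact_labels = np.random.randint(0, 2, 1000)
soil_labels = np.random.randint(0, 3, 1000)  # e.g. three ground materials

# S3: train both models on the collected samples.
touchdown_net.fit(X, contact_labels)
soil_classifier.fit(X[contact_labels == 1], soil_labels[contact_labels == 1])

# S4: at run time the trained pair acts as the virtual-sensor terrain perception system.
def perceive(sample: np.ndarray):
    in_contact = bool(touchdown_net.predict(sample.reshape(1, -1))[0])
    soil_type = int(soil_classifier.predict(sample.reshape(1, -1))[0]) if in_contact else None
    return in_contact, soil_type
```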

Description

Technical field

[0001] The invention relates to a virtual-sensor-based terrain perception method for a legged robot, in particular a legged robot working outdoors; it is suitable for a legged robot to acquire internal and external environment information, and belongs to the field of robot perception.

Background technique

[0002] Terrain perception is extremely important for outdoor operations, especially for deep space exploration robots. Complex unstructured geological conditions have a great impact on the mobility of a robot, and can prevent it from completing predetermined tasks or even place it in danger. Pure vision-based terrain perception methods have difficulty obtaining the terrain structure and physical characteristics beneath the surface soil, while most tactile-based perception methods require tactile sensors to be installed on the robot's feet; these sensors interact with various terrains and are easily damaged, there...

Application Information

Patent Type & Authority: Application (China)
IPC (8): G06N3/08; G06N20/00; G06K9/62; G01B21/22; G01N33/24; G01P3/00; G01R19/00
CPC: G06N3/08; G06N20/00; G01N33/24; G01B21/22; G01P3/00; G01R19/00; G06F18/217; G06F18/214
Inventor: 吴爽危清清陈磊张沛王储刘宾姜水清李德伦刘鑫白美
Owner: BEIJING INST OF SPACECRAFT SYST ENG