A robot tactile action recognition system and recognition method

A technology in the field of motion recognition and robotics, applied to manipulators, program-controlled manipulators, manufacturing tools, etc. It addresses problems such as limited information content, excessive data volume, and heavy consumption of computing resources, and achieves improved accuracy, strong robustness, and good fault tolerance.

Active Publication Date: 2022-04-22
HEBEI UNIV OF TECH

AI Technical Summary

Problems solved by technology

Traditional single-point tactile sensors suffer from disadvantages such as difficult positioning, limited information, and poor repeatability, whereas array tactile sensors can collect more complete and comprehensive data, which provides a prerequisite for effectively extracting features from tactile information.
Because tactile sensor data carry multi-dimensional information, the large data volume occupies a large share of the robot controller's computing resources. At the same time, existing tactile sensors and hardware platforms only perceive and transmit data; they lack on-board intelligence such as low-level computation, judgment, and recognition, so tactile information acquisition, computation, and cognition over the whole robot skin are difficult to realize. This places a heavy computation and storage burden on the robot controller, and the intelligentization of robot tactile sensing cannot be achieved. For example, the document with publication number CN108681412A discloses an emotion recognition device and method based on an array tactile sensor: it recognizes only three emotion categories, which is too few; the algorithm is not embedded in a microcontroller, so intelligent recognition is completed on the host computer, placing a heavy computational burden on it; and it cannot run multiple algorithms at the same time, as doing so would occupy excessive computer resources.



Examples


Embodiment 1

[0066] In this embodiment, in (1) of the first step, the carrier is the forearm of the robot, m=32, n=32, x=100, M=25, R=12, N=40 (20 trials with the right hand and 20 with the left hand), P=12000, u=32. The 12 characteristic actions are pull, pinch, push, grab, grasp, poke, tug, hit, stroke, scratch, pat, and slide. The embedded microcontroller 2 is connected to the host computer 3 through a USB data cable.
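These parameters imply P = M × R × N = 25 × 12 × 40 = 12000 collected samples from the 32 × 32 sensing array. The sketch below is only an illustration of how such a trial set might be enumerated in Python; the function read_sample, the reading of x = 100 as frames per sample, and all variable names are assumptions for illustration, not details taken from the patent.

```python
import numpy as np

M, R, N = 25, 12, 40            # participants, action classes, repetitions per action
m, n, x = 32, 32, 100           # array rows, array columns, frames per sample (assumed meaning of x)

def read_sample(participant, action, repetition):
    """Hypothetical acquisition of one tactile sample from the 32x32 array sensor."""
    return np.zeros((x, m, n), dtype=np.float32)    # x frames of a 32x32 pressure map

# enumerate every trial: 20 right-hand + 20 left-hand repetitions per action per participant
index = [(i, j, k) for i in range(M) for j in range(R) for k in range(N)]
labels = np.array([j for _, j, _ in index])         # label = index of the characteristic action

assert len(index) == M * R * N == 12000             # matches P in Embodiment 1
one_sample = read_sample(*index[0])                 # shape (x, m, n) = (100, 32, 32)
```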

[0067] In (2) of the first step, the program-readable file i_j is saved as i_j.txt, i_j.xls, or i_j.xlsx.
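As a minimal sketch of the plain-text option in paragraph [0067], the helper below writes one sample to a file named i_j.txt; the helper name save_sample and the flattening convention are assumptions.

```python
import numpy as np

def save_sample(sample, i, j):
    # flatten each frame to one row so the text file stays two-dimensional
    np.savetxt(f"{i}_{j}.txt", sample.reshape(sample.shape[0], -1), fmt="%.4f")

save_sample(np.zeros((100, 32, 32), dtype=np.float32), 1, 1)   # writes 1_1.txt
```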

[0068] In (3) of the first step, the PyTorch deep learning framework is adopted. a=80, and the highest accuracy on the validation set is 94.57%. The convolutional neural network model is saved in the .tflite format.
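A hedged sketch of such a training step is given below, taking a=80 to mean an 80/20 train/validation split, which is an interpretation rather than something stated in the excerpt. The tiny placeholder dataset, the stand-in model, the optimizer, the loss, and the epoch count are all illustrative; in the patent the data are the P=12000 tactile samples and the network is the seven-convolutional-layer model of Figure 4.

```python
import torch
from torch import nn
from torch.utils.data import TensorDataset, DataLoader, random_split

# tiny placeholder dataset standing in for the P=12000 tactile samples (R=12 classes)
samples = torch.zeros(200, 100, 32, 32)
labels = torch.randint(0, 12, (200,))
dataset = TensorDataset(samples, labels)

n_train = int(0.8 * len(dataset))                   # a = 80 -> 80% of the data for training
train_set, val_set = random_split(dataset, [n_train, len(dataset) - n_train])
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)   # u = 32 read as batch size
val_loader = DataLoader(val_set, batch_size=32)

model = nn.Sequential(nn.Flatten(), nn.Linear(100 * 32 * 32, 12))   # stand-in for the CNN of Figure 4
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

for epoch in range(3):                              # epoch count is illustrative only
    model.train()
    for xb, yb in train_loader:
        optimizer.zero_grad()
        criterion(model(xb), yb).backward()
        optimizer.step()

    model.eval()
    correct = 0
    with torch.no_grad():
        for xb, yb in val_loader:
            correct += (model(xb).argmax(dim=1) == yb).sum().item()
    print(f"epoch {epoch}: validation accuracy = {correct / len(val_set):.4f}")
```

Note that .tflite is natively a TensorFlow Lite format, so a PyTorch model would typically need an intermediate conversion (for example through ONNX) before it could be saved that way; the excerpt does not describe that step.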

[0069] As can be seen from Figure 4, the convolutional neural network architecture consists of seven convolutional layers and a pooling layer, c represents t...
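One architecture consistent with this description (seven convolutional layers followed by a single pooling layer) is sketched below in PyTorch. The channel widths, kernel sizes, position of the pooling layer, number of input channels, and the final classifier are all assumptions, since Figure 4 itself is not reproduced in this excerpt.

```python
import torch
from torch import nn

class TactileCNN(nn.Module):
    def __init__(self, in_channels=100, num_classes=12):
        super().__init__()
        # seven 3x3 convolutional layers (channel widths are assumed, not taken from Figure 4)
        widths = [in_channels, 16, 16, 32, 32, 64, 64, 64]
        layers = []
        for c_in, c_out in zip(widths[:-1], widths[1:]):
            layers += [nn.Conv2d(c_in, c_out, kernel_size=3, padding=1), nn.ReLU()]
        layers.append(nn.MaxPool2d(2))                           # the single pooling layer
        self.features = nn.Sequential(*layers)
        self.classifier = nn.Linear(64 * 16 * 16, num_classes)   # 32x32 map halved by pooling

    def forward(self, x):
        x = self.features(x)                      # (batch, 64, 16, 16)
        return self.classifier(x.flatten(1))      # logits over the 12 action classes

# quick shape check with a dummy batch of two 32x32 tactile samples
print(TactileCNN()(torch.zeros(2, 100, 32, 32)).shape)   # torch.Size([2, 12])
```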



Abstract

The invention discloses a robot tactile action recognition system and a recognition method. The recognition system includes an array tactile sensor, an embedded microcontroller, and a host computer; the array tactile sensor is worn on a carrier and connected to the embedded microcontroller, which in turn is connected to the host computer. The recognition system provides tactile information measurement, data collection and storage, feature extraction, and action recognition during human-robot physical contact. In the recognition method, participants apply different characteristic actions to the tactile sensor to collect tactile data; features are extracted from the data, a neural network is trained on the host computer, and the discriminant algorithm is embedded in the microcontroller. In this way, pattern recognition of characteristic actions is realized, and the actions a person applies to the carrier can be recognized.
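For orientation only, the sketch below mimics the run-time side of this workflow: the embedded microcontroller reads a frame from the array sensor, runs the embedded discriminant algorithm, and reports the recognized action to the host computer. The functions read_tactile_frame, run_embedded_model, and send_to_host are hypothetical stubs, and the action names follow the list in Embodiment 1.

```python
import numpy as np

ACTIONS = ["pull", "pinch", "push", "grab", "grasp", "poke",
           "tug", "hit", "stroke", "scratch", "pat", "slide"]

def read_tactile_frame():
    return np.zeros((32, 32), dtype=np.float32)     # stand-in for one sensor read

def run_embedded_model(frame):
    return 0                                        # stand-in for the on-MCU classifier

def send_to_host(action_name):
    print("recognized action:", action_name)        # stand-in for the USB/serial report

def recognition_loop(steps=1):
    for _ in range(steps):                          # a real device would loop indefinitely
        frame = read_tactile_frame()
        send_to_host(ACTIONS[run_embedded_model(frame)])

recognition_loop()
```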

Description

Technical field

[0001] The invention relates to the field of robot tactile perception, and in particular to a robot tactile action recognition system and recognition method.

Background technique

[0002] In recent years, with the rapid development of intelligent manufacturing technology and of robots of all kinds, human-robot interaction technology has attracted increasing attention and become a research hotspot. Robots have gradually appeared in all walks of life and play an increasingly important social role, and more and more robots are deployed in densely populated places to perform service work. Traditional robots mostly work in structured environments and operate according to specific programs: their work routes and operating methods are relatively fixed, they complete a single task, and they avoid direct contact with humans during work. This limits the diversity of social roles and work tasks of robots, especially service robots, and cannot ac...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): B25J19/02; B25J9/16
CPC: B25J19/02; B25J9/1694
Inventors: 刘吉晓, 王鹏, 侯福宁, 刘阔, 郭士杰
Owner: HEBEI UNIV OF TECH