
An object classification method based on feature extraction of tactile information of multi-fingered manipulator

A feature-extraction and classification technology, applied to manipulators, computer components, instruments, and related fields, addressing the problems of fragmented tactile feature extraction and analysis, pressure-only sensing, and the inability to fully adapt to the characteristics of different tactile information.

Active Publication Date: 2019-06-25
浙江浙大西投脑机智能科技有限公司

AI Technical Summary

Problems solved by technology

[0005] (1) The application of tactile sensors is still in its infancy: they are used only for simple contact judgments and are limited to pressure sensing. In particular, domestic teams neglect temperature and humidity alongside pressure, so complete tactile generation and simulation cannot be achieved.
(2) Feature extraction and analysis of tactile information remain fragmented and cannot fully adapt to different tactile information characteristics.
(3) Existing manipulators can perform only simple, repetitive, mechanized work; they cannot form a self-feedback closed loop to complete complex operations such as the adaptive grasping and classification of many different objects.




Embodiment Construction

[0029] To describe the present invention more specifically, its technical solutions are described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0030] In this embodiment, the prototype is a multi-fingered manipulator: the arm is an Epson C4-601 robot arm and the gripper is a Robotiq three-finger hand. Together they can complete complex arm movements and grasping operations on many types of objects. The sensors used in this embodiment are flexible sheet pressure sensors and miniature temperature-and-humidity sensors. The pressure sensors are arranged in a 2×2 array; after further reducing the area of a single pressure sensor, a 3×3 or 4×4 array can be used instead. The specific arrangement is shown in Figure 1.
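As a hypothetical illustration of how one tactile frame from such fingertips could be assembled for later processing (the function name, scaling, and pairing of temperature/humidity per finger are assumptions, not from the patent), the per-finger 2×2 pressure grids plus temperature and humidity can be flattened into a single sample vector:

```python
import numpy as np

def flatten_tactile_frame(pressure_grids, temp_humidity):
    """Flatten one tactile frame into a single sample vector.

    pressure_grids: one 2x2 array per finger (three fingers here).
    temp_humidity:  one (temperature, humidity) pair per finger.
    """
    parts = []
    for grid, (t, h) in zip(pressure_grids, temp_humidity):
        grid = np.asarray(grid, dtype=float)
        assert grid.shape == (2, 2), "embodiment describes a 2x2 dot matrix"
        parts.append(grid.ravel())       # 4 pressure taxels
        parts.append(np.array([t, h]))   # temperature and humidity
    return np.concatenate(parts)         # 3 fingers x 6 values = 18-dim

# Example frame for a Robotiq-style three-finger gripper
grids = [[[0.1, 0.2], [0.3, 0.4]]] * 3
th = [(24.5, 40.0)] * 3
sample = flatten_tactile_frame(grids, th)
print(sample.shape)  # (18,)
```

A 3×3 or 4×4 variant would only change the per-finger taxel count; the flattening step stays the same.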

[0031] The data from all sensors are received and preprocessed by an Arduino lower computer, and then transferred to t...
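The text truncates before detailing the lower computer's preprocessing. A common, minimal choice for such sensor streams (an assumption here, not the patent's stated method) is a sliding-window average to suppress taxel noise before frames are forwarded upstream:

```python
import numpy as np

def moving_average(signal, window=5):
    """Smooth one taxel's time series with a sliding-window mean."""
    signal = np.asarray(signal, dtype=float)
    kernel = np.ones(window) / window
    # 'valid' keeps only positions where the window fully overlaps
    return np.convolve(signal, kernel, mode="valid")

# One taxel over time; the 5.0 reading is a noise spike
raw = [1.0, 1.2, 0.9, 1.1, 1.0, 5.0, 1.0, 1.1]
smooth = moving_average(raw, window=3)
print(smooth)
```

On an actual Arduino this would be done per taxel in the sampling loop; the NumPy version above just shows the arithmetic.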



Abstract

The invention discloses an object classification method based on feature extraction from the tactile information of a multi-fingered manipulator. In this method, multiple types of tactile sensors are deployed in arrays on the fingertips of a multi-fingered manipulator. Combined with the manipulator's grasping of and sliding over objects, various kinds of tactile information are obtained, including but not limited to pressure, vibration, temperature, and humidity. Each type of tactile information is processed separately to simulate the corresponding component of the sense of touch, yielding various properties of the objects. A support vector machine is then used for training and learning: tactile components such as rigidity, temperature and humidity, surface texture, and thermal conductivity are mapped into a feature vector, so that various objects can be classified online or offline, thereby realizing a bionic analogue of the human sense of touch.
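The support-vector-machine step of the abstract can be sketched as follows. The feature names, the synthetic two-class data, and the scikit-learn pipeline are illustrative assumptions, not the patent's actual training procedure:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical feature vectors: [rigidity, temperature, humidity,
# surface-texture roughness, thermal conductivity]
metal  = rng.normal([0.9, 22.0, 30.0, 0.1, 0.8], 0.05, size=(30, 5))
sponge = rng.normal([0.1, 24.0, 55.0, 0.6, 0.1], 0.05, size=(30, 5))
X = np.vstack([metal, sponge])
y = np.array([0] * 30 + [1] * 30)  # 0 = metal-like, 1 = sponge-like

# Standardize features, then fit an RBF-kernel SVM
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)

# A rigid, cool, dry, smooth, conductive probe object
probe = [[0.85, 21.5, 32.0, 0.12, 0.75]]
print(clf.predict(probe))  # expected to land in the metal-like class
```

Once trained, the same `predict` call supports the online classification the abstract mentions: each new grasp is flattened into one feature vector and classified immediately.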

Description

Technical field

[0001] The invention belongs to the technical field of bionic applications of manipulators, and in particular relates to an object classification method based on feature extraction from the tactile information of multi-fingered manipulators.

Background technique

[0002] In recent years, robot bionics has developed rapidly, and robots are increasingly widely used in industry and the service sector. At the same time, new kinds of sensors are constantly emerging, with great progress in volume, measurement accuracy, and deployment form. At present, however, robots are still engaged in mechanized, repetitive actions, and applications of self-feedback mechanisms remain rare. In addition, while robot bionics has made great progress in vision and hearing, there is still a large gap in the sense of touch. Where touch exists at all, it is currently used only for simple contact judgments and similar tasks, and it is not complete...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K9/00, G06K9/62, B25J13/08
CPC: B25J13/084, G06F2218/08, G06F18/2411
Inventors: 李石坚, 叶振宇, 焦文均, 陶海, 杨莎, 潘纲
Owner: 浙江浙大西投脑机智能科技有限公司