
Human-machine cooperation system gesture recognition control method based on deep learning

A gesture-recognition and human-machine collaboration technology, applied to neural learning methods, character and pattern recognition, and computer components. It addresses problems such as low accuracy, hidden safety hazards, and machine-caused injury in existing approaches, and achieves high recognition accuracy while solving the problems of collaborative safety and high-precision operation.

Active Publication Date: 2018-09-21
XIAN UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0005] The purpose of the present invention is to provide a gesture recognition control method for a human-machine collaboration system based on deep learning, which solves the problems of existing human-machine collaboration technology for industrial robots, such as low precision, large data volume, machine-caused injury, and hidden safety hazards.

Method used




Detailed Description of Embodiments

[0018] The present invention will be described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0019] The present invention is a gesture recognition control method for a human-machine collaboration system based on deep learning, comprising the following steps:

[0020] Step 1. Track the gesture in real time and obtain the gesture image of the operator;

[0021] In step 1, real-time tracking of the gesture and acquisition of the operator's gesture image specifically comprise: using a Kinect somatosensory camera to collect the user's gesture image; using the CamShift tracking algorithm to confirm that the gesture image stays within the collection range, and performing real-time tracking and sampling; and transmitting the operator's gesture RGB image and depth image to a PC over USB for processing.

[0022] Use the CamShift tracking algorithm to confirm that the gesture image is within the collection range, and perform real-time ...
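Since the text above only outlines this step, the following is a minimal, hypothetical sketch of CamShift-based hand tracking with OpenCV. It is not the patent's implementation: an ordinary cv2.VideoCapture stream stands in for the Kinect RGB feed (real Kinect RGB/depth access would typically go through a driver such as libfreenect), and the initial hand window init_box is assumed to come from a separate segmentation step.

```python
# Hypothetical sketch of the CamShift tracking in step 1 (not the patent's code).
# Assumptions: a plain webcam stands in for the Kinect RGB stream, and the
# initial hand window `init_box` (x, y, w, h) is already known.
import cv2

def track_hand(init_box, device=0):
    """Track the hand region with CamShift and yield the window per frame."""
    cap = cv2.VideoCapture(device)              # stand-in for the Kinect RGB feed
    ok, frame = cap.read()
    if not ok:
        raise RuntimeError("camera not available")

    x, y, w, h = init_box
    roi = frame[y:y + h, x:x + w]
    hsv_roi = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
    # Hue histogram of the initial hand region, used for back projection.
    mask = cv2.inRange(hsv_roi, (0, 60, 32), (180, 255, 255))
    roi_hist = cv2.calcHist([hsv_roi], [0], mask, [180], [0, 180])
    cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)

    term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
    track_window = (x, y, w, h)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
        # CamShift adapts the search window to the tracked region each frame.
        _, track_window = cv2.CamShift(back_proj, track_window, term_crit)
        yield frame, track_window            # crop track_window for step 2

    cap.release()
```

Each yielded track_window could then be cropped, paired with the corresponding depth frame, and passed on to the deep-learning classifier of step 2.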



Abstract

The invention discloses a deep-learning-based gesture recognition control method for a human-machine collaboration system. The method comprises the following steps: S1, tracking a gesture in real time and obtaining the gesture image of an operator; S2, automatically learning gesture image features through a deep learning algorithm, and recognizing and classifying the various postures of the same gesture; and S3, sending a corresponding preset robot control instruction according to the recognized and classified gesture information. With this method, a camera can be selected automatically according to the workpiece characteristics for workpiece target acquisition, and the workpiece characteristics are then transmitted to the processing side to complete the communication configuration. The operator can perform remote monitoring from a remote computer; moreover, the various postures of the same gesture action can still be accurately recognized and classified when the operator changes, realizing human-machine collaboration in a true sense.
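To make steps S2 and S3 concrete, here is a minimal, hypothetical sketch (not the patent's actual network) of a small convolutional classifier over cropped gesture images, plus a lookup table mapping each predicted class to a preset robot control instruction. The class names, input size, and command strings are illustrative assumptions.

```python
# Hypothetical sketch of steps S2/S3 (not the patent's model): classify a
# cropped gesture image with a small CNN and map the class to a preset command.
import torch
import torch.nn as nn

GESTURE_CLASSES = ["stop", "start", "left", "right", "grip"]   # illustrative labels
ROBOT_COMMANDS = {                                              # illustrative preset instructions
    "stop": "HALT",
    "start": "RESUME",
    "left": "MOVE_LEFT",
    "right": "MOVE_RIGHT",
    "grip": "CLOSE_GRIPPER",
}

class GestureNet(nn.Module):
    """Small CNN over 64x64 RGB gesture crops."""
    def __init__(self, num_classes=len(GESTURE_CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 64 -> 32
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 32 -> 16
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 16 -> 8
        )
        self.classifier = nn.Linear(64 * 8 * 8, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

def gesture_to_command(model, crop):
    """crop: normalized float tensor of shape (3, 64, 64) from the tracked window."""
    model.eval()
    with torch.no_grad():
        idx = int(model(crop.unsqueeze(0)).argmax(dim=1))
    return ROBOT_COMMANDS[GESTURE_CLASSES[idx]]

# Usage with a random crop, just to show the call shape (a trained model is assumed).
print(gesture_to_command(GestureNet(), torch.rand(3, 64, 64)))
```

In a full system, the model would be trained on labeled crops of the same gesture in its various postures, so that classification stays stable when the operator changes.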

Description

Technical Field

[0001] The invention belongs to the technical field of collaborative robot control, and relates to a gesture recognition control method for a human-machine collaboration system based on deep learning.

Background Art

[0002] At present, most collaborative robot systems operate mainly with pre-taught path trajectories and actions. This approach has many blind spots and hidden safety hazards. In many harsh, harmful, high-temperature, high-pressure, and dangerous environments, as well as in medical, chemical, and industrial production, consequences are unpredictable and close-range operation is difficult. For example, the repair and maintenance of nuclear power plants, high-voltage live-line work, crater sampling, and space exploration all need to be completed by robots in place of humans.

[0003] In recent years, in medical surgery, small-parts assembly in industrial production, and dangerous chemical experiments, although robots have ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/62; G06N3/08; G06F3/01
CPC: G06F3/017; G06N3/084; G06V40/28; G06F18/24
Inventor: 杨延西, 杨志伟
Owner: XIAN UNIV OF TECH