
Human hand trajectory prediction and intention recognition method based on multi-feature fusion

A multi-feature fusion and trajectory prediction technology in the field of visual recognition, addressing problems such as complex working environments, long hand travel, and unstable prediction performance.

Pending Publication Date: 2022-06-24
HUNAN UNIV

AI Technical Summary

Problems solved by technology

However, the working environment may be complex, and the distribution of parts and tools may be chaotic and disordered; in such cases, arm position alone is often insufficient to recognize and predict the trajectory of the human hand.
For example, in an assembly scenario, workers usually straighten up or stand in order to pick up parts that are far away, whereas nearby parts can be grasped simply by raising the hand. The grasping behaviors (action and stroke) for these two types of parts are completely different, so relying on arm position alone for trajectory prediction leads to unstable performance.




Embodiment Construction

[0028] As shown in Figure 1, a schematic flowchart of the method: the multi-feature fusion-based human hand trajectory prediction and intention recognition method provided by the present invention includes the following steps:

[0029] S1. Obtain the key point data of the person's face and shoulders, and obtain the palm trajectory data;
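
The patent does not name a specific pose estimator for step S1. As a minimal illustrative sketch, the following Python snippet uses MediaPipe Pose (an assumption, not stated in the source) to collect face/shoulder key points per frame and the wrist position as a proxy for the palm.

# Hypothetical keypoint acquisition for step S1; MediaPipe Pose is an
# assumption -- the patent does not specify the pose estimator.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose
# Landmarks standing in for the "face and shoulder key points".
FACE_SHOULDER_IDS = [
    mp_pose.PoseLandmark.NOSE,
    mp_pose.PoseLandmark.LEFT_EYE,
    mp_pose.PoseLandmark.RIGHT_EYE,
    mp_pose.PoseLandmark.LEFT_EAR,
    mp_pose.PoseLandmark.RIGHT_EAR,
    mp_pose.PoseLandmark.LEFT_SHOULDER,
    mp_pose.PoseLandmark.RIGHT_SHOULDER,
]

def extract_sequences(video_path):
    """Return per-frame face/shoulder keypoints and palm (wrist) positions."""
    face_shoulder_seq, palm_seq = [], []
    cap = cv2.VideoCapture(video_path)
    with mp_pose.Pose(static_image_mode=False) as pose:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.pose_landmarks is None:
                continue  # skip frames where no person is detected
            lm = results.pose_landmarks.landmark
            face_shoulder_seq.append(
                [(lm[i].x, lm[i].y, lm[i].z) for i in FACE_SHOULDER_IDS])
            wrist = lm[mp_pose.PoseLandmark.RIGHT_WRIST]  # palm proxy
            palm_seq.append((wrist.x, wrist.y, wrist.z))
    cap.release()
    return face_shoulder_seq, palm_seq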

[0030] S2. Input the face and shoulder key point data into a support vector machine (SVM) to obtain face orientation modal information; input the palm trajectory data sequence into a Savitzky-Golay filter (SG filter) to eliminate trajectory fluctuations and obtain smooth trajectory data;
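
A minimal sketch of step S2, assuming keypoints are flattened into fixed-length feature vectors and the palm trajectory is a (T, 3) array; the SVM kernel, the orientation classes, and the SG filter's window length and polynomial order are illustrative choices, not values from the patent.

# Step S2 sketch: SVM for face orientation, SG filter for trajectory smoothing.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.svm import SVC

# Dummy training data (assumption): 7 keypoints * 3 coords per sample,
# two orientation classes. Real labels would come from annotated frames.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 21))
y_train = rng.integers(0, 2, size=200)

orientation_svm = SVC(kernel="rbf")
orientation_svm.fit(X_train, y_train)

def face_orientation(keypoints):
    """Classify face orientation modal information from one frame's keypoints."""
    return orientation_svm.predict(np.asarray(keypoints).reshape(1, -1))[0]

def smooth_trajectory(palm_xyz, window=11, polyorder=3):
    """Savitzky-Golay smoothing along time; palm_xyz is a (T, 3) array."""
    return savgol_filter(palm_xyz, window_length=window,
                         polyorder=polyorder, axis=0)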

[0031] S3. Fuse the two types of modal information in parallel to obtain multi-modal fusion information;
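
The patent does not spell out the fusion operator. One plausible reading of "parallel fusion", sketched below, is per-time-step concatenation of the smoothed trajectory with a one-hot encoding of the face orientation class; both the encoding and the operator are assumptions.

# Step S3 sketch: concatenate the two modalities per time step (assumed operator).
import numpy as np

def parallel_fuse(smoothed_traj, orientations, n_classes=2):
    """smoothed_traj: (T, 3) palm positions; orientations: (T,) integer class ids.
    Returns a (T, 3 + n_classes) fused sequence."""
    one_hot = np.eye(n_classes)[np.asarray(orientations)]
    return np.concatenate([smoothed_traj, one_hot], axis=1)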

[0032] S4. Input the multi-modal fusion information into an LSTM (Long Short-Term Memory) network and output the predicted trajectory of the palm.
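
The patent confirms an LSTM but not its architecture. Below is a minimal PyTorch sketch; the input width (matching the fused vector above), hidden size, and prediction horizon are illustrative assumptions.

# Step S4 sketch: LSTM that maps the fused sequence to future palm positions.
import torch
import torch.nn as nn

class PalmTrajectoryLSTM(nn.Module):
    def __init__(self, in_dim=5, hidden=64, horizon=10):
        super().__init__()
        self.lstm = nn.LSTM(in_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, horizon * 3)  # horizon future xyz points
        self.horizon = horizon

    def forward(self, fused_seq):
        """fused_seq: (B, T, in_dim) multi-modal fusion sequences."""
        _, (h_n, _) = self.lstm(fused_seq)
        out = self.head(h_n[-1])              # last layer's final hidden state
        return out.view(-1, self.horizon, 3)  # (B, horizon, 3) predicted path

# Usage: PalmTrajectoryLSTM()(torch.randn(1, 30, 5)) yields a (1, 10, 3) path.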

[0033] Figure 2 is a schematic diagram of the positions of the key points in the metho...



Abstract

The invention discloses a human hand trajectory prediction and intention recognition method based on multi-feature fusion. The method comprises the following steps: acquiring face and shoulder key point data of a person; acquiring palm trajectory data; inputting the face and shoulder key point data into a support vector machine to obtain face orientation modal information; inputting the palm trajectory data sequence into an SG filter to eliminate trajectory fluctuations and obtain smooth trajectory data; fusing the two types of modal information in parallel to obtain multi-modal fusion information; and inputting the fused information into an LSTM network to output a predicted trajectory of the palm. The method extracts face orientation features from a subset of the facial key points, fuses them with the human arm trajectory data, and predicts the arm's moving trajectory in space and its final arrival position, achieving efficient and accurate trajectory prediction.

Description

Technical Field

[0001] The invention belongs to the field of visual recognition, and in particular relates to a human hand trajectory prediction and intention recognition method based on multi-feature fusion.

Background Technique

[0002] Due to the nonlinearity, randomness, and diversity of external and internal stimuli, human motion is difficult to predict accurately. For workers on an assembly line, however, motion behavior is mainly driven by task goals and part layout, and this contextual information enables good predictions. Still, the working environment may be complex, and the distribution of parts and tools may be chaotic and disordered, so arm position alone is often insufficient to recognize and predict the position and trajectory of the human hand. For example, in an assembly scene, workers usually straighten up or stand in order to pick up parts that are far away, so the starting position of the arms is relatively high, and the hand ...


Application Information

IPC (8): G06V40/20; G06V40/16; G06V10/80; G06V10/82; G06K9/62; G06N3/04; G06T7/20
CPC: G06T7/20; G06T2207/10016; G06T2207/30196; G06T2207/30241; G06N3/044; G06F18/254
Inventor: 李智勇, 甘毅辉, 陈文锐
Owner: HUNAN UNIV