
Behavior detection method

A behavior detection method, applied in neural learning methods, instruments, biological neural network models, etc., which can solve problems such as low interpretability and difficult detection

Pending Publication Date: 2021-12-07
南京康博智慧健康研究院有限公司

AI Technical Summary

Problems solved by technology

[0002] Accurate behavior detection is still a difficult goal.
Recent advances in machine learning have enabled limb localization; however, although limb position or pose is informative, its corresponding behavioral interpretability is rather low.


Embodiment Construction

[0028] It will be readily appreciated that, without departing from the true spirit of the invention, those skilled in the art can implement the solution of the present invention with a variety of alternative structures and approaches to arrive at further embodiments. Accordingly, the following detailed description and the drawings are merely illustrative of the technical solution of the present invention and should not be regarded as exhaustive, nor as defining or limiting the scope of the invention.

[0029] As shown in Figure 1, the present invention provides the following technical solution: a behavior detection method comprising the steps of:

[0030] Step 1: extract human body feature point coordinate information.

[0031] S1-1: preset a human body posture acquisition device and establish a time series, obtaining a data set that contains all human pose estimates in temporal order;

[0032] S1-2: apply the OpenPose and Dee... human body pose estimation algorithms, respectively, to extract key point information from the data set...
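
As a concrete illustration of step 1, the following is a minimal sketch of assembling the time-ordered keypoint data set from OpenPose's standard per-frame JSON output (one `*_keypoints.json` file per frame, keypoints stored as a flat [x, y, confidence] list). The directory name and helper function are illustrative assumptions, not part of the patent.

```python
import json
from pathlib import Path

import numpy as np


def load_keypoint_series(json_dir: str) -> np.ndarray:
    """Read per-frame OpenPose JSON files in name (i.e. time) order and
    stack the first detected person's 2D keypoints into an array of shape
    (num_frames, num_keypoints, 2)."""
    frames = []
    for path in sorted(Path(json_dir).glob("*_keypoints.json")):
        with open(path, "r") as f:
            data = json.load(f)
        people = data.get("people", [])
        if not people:
            continue  # skip frames where no person was detected
        # OpenPose stores keypoints as a flat [x1, y1, c1, x2, y2, c2, ...] list
        flat = np.asarray(people[0]["pose_keypoints_2d"], dtype=np.float32)
        frames.append(flat.reshape(-1, 3)[:, :2])  # keep (x, y), drop confidence
    return np.stack(frames)


# Hypothetical usage: one recording, one JSON file per video frame
# keypoints = load_keypoint_series("recording_01/openpose_json")
# keypoints.shape -> (num_frames, 25, 2) with the BODY_25 model
```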


Abstract

The invention provides a behavior detection method. The behavior detection method comprises: extracting human body feature point coordinate information; enhancing the coordinate information of the human body feature points to obtain kinematic information of the human body postures; combining the human body feature point coordinate information with the kinematic information to obtain high-dimensional information, and applying a preset nonlinear dimensionality reduction algorithm to obtain low-dimensional effective data with redundant and low-signal-to-noise-ratio components removed; and performing unsupervised clustering on the low-dimensional effective data, training a convolutional neural network to form a behavior recognition classifier, and outputting and displaying the motion behavior of the human body posture. After posture modes are obtained by fusing the correlation information between posture and action with the kinematic parameters and applying an unsupervised algorithm, classifier training is carried out using a 1DCNN (one-dimensional convolutional neural network) combined with continuous-frame time-series information to obtain the action category, so that behavior recognition is achieved and the problem that accurate behavior detection is relatively difficult is solved.
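
To make the pipeline in the abstract concrete, below is a minimal sketch of the middle stages: deriving kinematic information (velocity and acceleration) from the keypoint coordinates, fusing it with the coordinates into high-dimensional per-frame features, then applying nonlinear dimensionality reduction and unsupervised clustering to obtain posture-mode labels. The abstract does not name the specific algorithms, so t-SNE and k-means are stand-in assumptions, as are the frame rate and the number of modes.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.manifold import TSNE


def fuse_kinematics(keypoints: np.ndarray, fps: float = 30.0) -> np.ndarray:
    """Augment the (num_frames, num_keypoints, 2) coordinate series with
    velocity and acceleration along the time axis, then flatten each frame
    into a single high-dimensional feature vector."""
    dt = 1.0 / fps
    velocity = np.gradient(keypoints, dt, axis=0)      # first time derivative
    acceleration = np.gradient(velocity, dt, axis=0)   # second time derivative
    fused = np.concatenate([keypoints, velocity, acceleration], axis=-1)
    return fused.reshape(len(keypoints), -1)           # (num_frames, high_dim)


def posture_mode_labels(features: np.ndarray, n_modes: int = 8) -> np.ndarray:
    """Nonlinear dimensionality reduction followed by unsupervised clustering;
    each frame's cluster index serves as its posture-mode pseudo-label."""
    low_dim = TSNE(n_components=2, random_state=0).fit_transform(features)
    return KMeans(n_clusters=n_modes, n_init=10, random_state=0).fit_predict(low_dim)
```

The pseudo-labels produced this way are what the subsequent classifier is trained against, which is how the method avoids manual behavior annotation.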

Description

Technical field
[0001] The present invention relates to the technical field of human posture detection and recognition, and in particular to a behavior detection method.
Background technique
[0002] At present, accurate behavior detection is still a difficult goal. Recent advances in machine learning have made it possible to localize the limbs; however, although limb position or posture is informative, its corresponding behavioral interpretability is rather low. Extracting behavioral information requires determining the spatial and temporal patterns of these positions.
[0003] Accordingly, the present invention fuses the correlation information between posture and action with kinematic parameters, obtains posture modes using an unsupervised algorithm, and then carries out classifier training with a 1DCNN (one-dimensional convolutional neural network) combined with continuous-frame time-series information to obtain the action category, so as to achieve behavior recognition and thereby solve the pr...
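
The final stage described above is classifier training with a 1DCNN over continuous-frame time-series information. Below is a minimal PyTorch sketch of such a network; the window length, channel widths, feature dimensionality and number of action classes are illustrative assumptions rather than values from the patent.

```python
import torch
import torch.nn as nn


class Behavior1DCNN(nn.Module):
    """1D CNN over a window of consecutive frames. Input shape is
    (batch, feature_dim, window_len): the fused coordinate/kinematic
    features act as channels and the frame index is the convolution axis."""

    def __init__(self, feature_dim: int, num_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(feature_dim, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(64, 128, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # pool over the time (frame) axis
        )
        self.classifier = nn.Linear(128, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).squeeze(-1))


# Hypothetical usage: 30-frame windows of 150-dimensional fused features,
# classified into 8 action categories (all sizes are assumptions).
model = Behavior1DCNN(feature_dim=150, num_classes=8)
logits = model(torch.randn(4, 150, 30))  # -> (4, 8) class scores
```

Treating the fused per-frame features as input channels and the frame index as the convolution axis lets the network learn temporal patterns across consecutive frames, which is what gives raw pose data its behavioral meaning.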


Application Information

IPC(8): G06K9/00, G06K9/46, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/08, G06N3/045, G06F18/23, G06F18/24
Inventor: 朱樊, 顾海松
Owner: 南京康博智慧健康研究院有限公司