
A human behavior recognition method and recognition system based on deep neural network

This technology concerns deep neural networks and recognition methods, and is applied to neural learning methods, biological neural network models, and character and pattern recognition. It addresses the prior-art problems of limited categories of recognizable human behavior, slow recognition speed, and low recognition accuracy, with the effects of faster behavior recognition, improved accuracy, and better protection of user privacy.

Inactive Publication Date: 2018-08-24
SHENZHEN UNIV

AI Technical Summary

Problems solved by the technology

[0005] In view of this, embodiments of the present invention aim to provide a human behavior recognition method and recognition system based on a deep neural network, in order to solve the prior-art problems of limited categories of recognizable human behavior, low recognition accuracy, and slow recognition speed.




Detailed Description of the Embodiments

[0045] To make the objectives, technical solutions, and advantages of the present invention clearer, the invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here serve only to explain the present invention, not to limit it.

[0046] A specific embodiment of the present invention provides a human behavior recognition method based on a deep neural network, mainly comprising the following steps:

[0047] S11. Obtain the raw depth data stream of the actor;

[0048] S12. Extract human skeleton joint point data from the actor's raw depth data stream;

[0049] S13. Use the three-dimensional coordinates corresponding to the extracted human skeleton joint point data to model the entire ...
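The patent's description of the modeling step is truncated here, so the following is only an illustrative sketch of steps S11–S13 under stated assumptions: the joint layout (joint 0 as hip center), the normalization scheme, and the function name are all hypothetical, not taken from the patent.

```python
import numpy as np

def extract_skeleton_features(joints):
    """Turn 3D skeleton joint coordinates into a flat feature vector.

    joints: (N, 3) array of joint coordinates, e.g. as extracted from
    a depth sensor's data stream (hypothetical layout: joint 0 is
    assumed to be the hip center; the patent does not specify this).
    """
    hip = joints[0]                 # assumed reference joint
    centered = joints - hip         # translation invariance
    scale = np.linalg.norm(centered, axis=1).max()
    return (centered / (scale + 1e-8)).ravel()  # flat feature vector

# toy example: 20 joints, roughly the count a Kinect-style sensor reports
rng = np.random.default_rng(0)
features = extract_skeleton_features(rng.normal(size=(20, 3)))
print(features.shape)  # (60,)
```

Centering on a body joint and scaling by the largest joint distance is one common way to make skeleton features invariant to the actor's position and size; the patent may use a different modeling scheme.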



Abstract

The invention provides a human behavior recognition method based on a deep neural network, comprising: obtaining the raw depth data stream of the actor; extracting human skeleton joint point data from that depth data stream; using the three-dimensional coordinates corresponding to the extracted skeleton joint point data to model the entire human body; performing feature extraction on the human-body model and sending the feature data to a restricted Boltzmann machine network for pre-training, with the resulting weights used to initialize the parameters of a BP (back-propagation) neural network; training the deep neural network model and performing behavior recognition on the extracted features; using multi-threaded parallel processing to overlay the extracted skeleton joint point data on the actual human body and display the recognized behavior in real time; and building an abnormal-behavior template library and raising an alarm for detected abnormal behavior. The invention can detect changes in human behavior in real time and raise an alarm for abnormal human behavior (such as a fall).
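The abstract's pre-training step — training a restricted Boltzmann machine and using its weights to initialize a BP network — can be sketched as below. This is a minimal NumPy illustration of one-step contrastive divergence (CD-1) for a Bernoulli RBM, not the patent's implementation; the layer sizes, hyperparameters, and function names are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(42)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_rbm(data, n_hidden, epochs=50, lr=0.1):
    """Train a Bernoulli RBM with one-step contrastive divergence (CD-1)."""
    n_visible = data.shape[1]
    W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
    b_v = np.zeros(n_visible)   # visible bias
    b_h = np.zeros(n_hidden)    # hidden bias
    for _ in range(epochs):
        # positive phase: hidden activations driven by the data
        h_prob = sigmoid(data @ W + b_h)
        h_sample = (rng.random(h_prob.shape) < h_prob).astype(float)
        # negative phase: one Gibbs step back to a reconstruction
        v_prob = sigmoid(h_sample @ W.T + b_v)
        h_prob_neg = sigmoid(v_prob @ W + b_h)
        # CD-1 parameter update
        W += lr * (data.T @ h_prob - v_prob.T @ h_prob_neg) / len(data)
        b_v += lr * (data - v_prob).mean(axis=0)
        b_h += lr * (h_prob - h_prob_neg).mean(axis=0)
    return W, b_h

# toy binary feature data (e.g. thresholded skeleton features; sizes assumed)
X = (rng.random((32, 60)) > 0.5).astype(float)
W, b_h = train_rbm(X, n_hidden=16)

# initialize the first layer of a BP (back-propagation) network
# with the pre-trained RBM weights, as the abstract describes
bp_layer1 = {"weights": W.copy(), "bias": b_h.copy()}
print(W.shape)  # (60, 16)
```

The design idea is that unsupervised RBM pre-training places the BP network's initial weights in a region that reflects the structure of the feature data, which typically speeds up and stabilizes the subsequent supervised training.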

Description

Technical field [0001] The invention relates to the field of video recognition, and in particular to a human behavior recognition method and recognition system based on a deep neural network. Background [0002] It is well known that society is aging. China has the largest elderly population in the world; according to reports, its elderly population will reach 248 million by 2020. Combined with the effects of the family-planning policy, only children are often busy with work, so many elderly people live independently most of the time. Many abnormal behaviors cannot be detected in time, and tragedies in which elderly people delay treatment and lose their lives occur from time to time, seriously affecting their quality of life. For example, falls are the direct cause of injury-related death among people over 65 in China. If we could accompany empty-nest elderly in their daily life, effectively identify their behavior, and detect abnormal ...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K9/00; G06N3/08
CPC: G06N3/08; G06V40/103
Inventors: 陈亮, 龙伟, 王娜, 李霞
Owner: SHENZHEN UNIV