
Real-time human body action recognition method based on joint point sequence

A human action recognition technology applied in the field of computer vision; it addresses the problem of missing action information and achieves the effect of reducing computational overhead.

Pending Publication Date: 2022-08-05
SHANDONG ARTIFICIAL INTELLIGENCE INST +1

AI Technical Summary

Problems solved by technology

Although some existing methods use sliding windows or a preset number of image-sequence frames to achieve real-time human action recognition, they do not solve the problem of locating actions within a video well and often miss important action information.



Examples


Embodiment 1

[0063] The steps of collecting the human body posture image in step a) are:

[0064] a-1.1) Connect a depth camera to the computer and capture human pose images with it. The test subject stands in front of the camera and performs a number of human movements according to the specified requirements; the movements include, but are not limited to, squatting, bending over, arm stretching, kicking, and push-ups. The collection covers multiple testers under various camera angles and different indoor environments, with all movements performed uniformly according to the specification. The human pose images are the instantaneous pose images captured by the camera at the beginning and end of each human action.

[0065] a-1.2) Align the color frame (colorframe) of the human pose image captured by the depth camera with the depth frame (depthframe) to obtain the aligned frame (alignframe).
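Step a-1.2) can be sketched with the pyrealsense2 library the patent relies on. This is an illustrative sketch, not the patent's own code; it assumes a connected RealSense camera, and the stream resolution and frame rate are arbitrary example values.

```python
def capture_aligned_frame():
    """Capture one color/depth frame pair and align the depth frame to the
    color frame's viewpoint (the "alignframe" of step a-1.2).

    Illustrative sketch only: requires an attached RealSense camera, and the
    640x480 @ 30 fps stream settings are example values, not the patent's.
    """
    import pyrealsense2 as rs  # imported here so the sketch can be defined without a camera

    pipeline = rs.pipeline()
    config = rs.config()
    config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
    config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
    pipeline.start(config)
    try:
        # rs.align maps the depth frame into the color stream's viewpoint.
        align = rs.align(rs.stream.color)
        frames = pipeline.wait_for_frames()
        aligned = align.process(frames)
        return aligned.get_color_frame(), aligned.get_depth_frame()
    finally:
        pipeline.stop()
```

The returned depth frame is pixel-aligned with the color frame, so a joint detected at pixel (x, y) in the color image can be looked up at the same (x, y) in the depth image.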

[0066] a-1.3) Send t...

Embodiment 2

[0069] The steps of collecting human action sequence data in step a) are:

[0070] a-2.1) Detect the two-dimensional joint point coordinates (x_i, y_i) with the two-dimensional human body posture detection model, then obtain the depth distance d_i between the i-th joint point and the depth camera through the depth frame's accessor in the pyrealsense2 function library.

[0071] a-2.2) The three-dimensional coordinates of the i-th human body joint point in a single frame are (X_i, Y_i, Z_i), where X_i = (x_i - ppx) * d_i / fx is the X-axis coordinate value of the i-th joint point in a single frame, Y_i = (y_i - ppy) * d_i / fy is the Y-axis coordinate value, and Z_i = d_i is the Z-axis coordinate value. Here ppx is the abscissa of the projection center of the depth camera, ppy is the ordinate of the projection center, fx is the focal length along the X axis, and fy is the focal length along the Y axis. ppx, ppy, fx, fy can be obtained throug...
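The back-projection in a-2.2) is the standard pinhole camera model. A minimal sketch in plain Python (the helper name `pixel_to_point` is hypothetical, not from the patent):

```python
def pixel_to_point(x, y, d, ppx, ppy, fx, fy):
    """Back-project a 2-D joint pixel (x, y) with depth d (metres) to 3-D
    camera coordinates, given the intrinsics ppx, ppy (principal point) and
    fx, fy (focal lengths). For an undistorted image this matches what
    pyrealsense2's rs2_deproject_pixel_to_point computes."""
    X = (x - ppx) * d / fx
    Y = (y - ppy) * d / fy
    Z = d
    return (X, Y, Z)


# A joint 300 px to the right of the principal point at 2 m depth, with
# fx = 600, maps to X = 300 * 2.0 / 600 = 1.0 m.
print(pixel_to_point(620.0, 240.0, 2.0, 320.0, 240.0, 600.0, 600.0))  # → (1.0, 0.0, 2.0)
```

In practice one would call the library's own deprojection routine with the intrinsics reported by the camera rather than hand-coding the formula, but the arithmetic is exactly the one stated in a-2.2).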

Embodiment 3

[0074] The depth camera in step a-1) is an Intel RealSense D435 depth camera.



Abstract

The real-time human body action recognition method based on a joint point sequence uses a depth camera and a two-dimensional human body posture detection model, so a human action sequence data set is easily obtained and fed into a human action recognition network; the actions performed by a test subject can be judged in real time without restricting the subject's freedom of movement. The human action network model makes good use of the posture features between image frames and captures both the spatial and the spatio-temporal features of the subject's actions. Meanwhile, the preprocessing method for the human action sequence data markedly removes redundant posture information between adjacent frames, effectively reducing the overhead of the human posture recognition network.

Description

Technical field

[0001] The invention relates to the technical field of computer vision, in particular to a real-time human action recognition method based on a joint point sequence.

Background technique

[0002] With the rapid development of artificial intelligence and computer vision technology and its wide application in life, human action recognition has become an emerging research direction. The recognition of human movements is also of great help to people's lives, such as pedestrian fall detection, sports analysis, and human health risk assessment. The goal of action recognition is to identify the actions that appear in a video, usually the actions of people in the video. A video can be seen as a data structure composed of a set of image frames arranged in time sequence, with an additional time dimension compared to images. Therefore, a human action can be regarded as a chronological arrangement of several human postures, which is a continuous process, from the starting postu...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06V40/20; G06K9/62; G06N3/04; G06N3/08; G06V10/44; G06V10/774; G06V10/82
CPC: G06N3/08; G06N3/045; G06F18/214
Inventor: 王字成, 李金宝, 舒明雷
Owner SHANDONG ARTIFICIAL INTELLIGENCE INST