
Real-time human body action recognition and counting method

A human action recognition and counting technology, applied in character and pattern recognition, computation, and the counting of input signals from several signal sources, achieving the effect of accurate counting and real-time recognition of human movements.

Active Publication Date: 2021-05-14
LINEWELL SOFTWARE +1

AI Technical Summary

Problems solved by technology

Based on this framework, the posture of the human body can be described and the key skeleton points identified; however, action recognition during human movement is still lacking, and existing human action recognition cannot achieve accurate counting in real time.




Embodiment Construction

[0032] As shown in Figure 1 and Figure 2, the real-time human action recognition and counting method of the present invention comprises:

[0033] Step 10: Obtain video data, extract human skeleton point data, and generate time-series action data;
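Step 10 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the pose estimator `estimate_skeleton` and the joint naming are hypothetical stand-ins for whatever skeleton-point framework the method builds on.

```python
from collections import deque

def build_action_sequence(frames, estimate_skeleton, window=64):
    """Turn raw video frames into a time-ordered sequence of skeleton poses.

    frames            -- iterable of video frames (e.g. decoded images)
    estimate_skeleton -- hypothetical pose estimator: frame -> {joint: (x, y)}
    window            -- number of most recent poses kept for analysis
    """
    sequence = deque(maxlen=window)  # sliding time window of per-frame poses
    for frame in frames:
        skeleton = estimate_skeleton(frame)  # skeleton points for this frame
        sequence.append(skeleton)
    return list(sequence)
```

A bounded `deque` is used here so the time series stays fixed-size for real-time processing; the actual method may retain the full sequence instead.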

[0034] Step 20: Perform fixed-point judgment on the action data according to the position information of the skeleton points in their temporal context, and determine the action category;
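One plausible reading of the "fixed-point judgment" in Step 20 is comparing key skeleton points at a reference pose against per-category templates. The sketch below assumes that reading; the template layout, tolerance, and use of only the latest pose are simplifying assumptions, not the patent's actual criterion.

```python
def judge_action_category(sequence, templates, tol=0.2):
    """Assign an action category by matching key skeleton points
    against per-category reference templates (a hypothetical scheme).

    sequence  -- list of {joint: (x, y)} poses over time
    templates -- {category: {joint: (x, y)}} reference key-point layout
    tol       -- maximum mean per-joint distance to accept a match
    """
    if not sequence:
        return None
    current = sequence[-1]  # latest pose; a real system would use temporal context
    best, best_dist = None, float("inf")
    for category, ref in templates.items():
        joints = set(ref) & set(current)
        if not joints:
            continue
        # mean Euclidean distance over the joints shared with the template
        dist = sum(
            ((current[j][0] - ref[j][0]) ** 2
             + (current[j][1] - ref[j][1]) ** 2) ** 0.5
            for j in joints
        ) / len(joints)
        if dist < best_dist:
            best, best_dist = category, dist
    return best if best_dist <= tol else None
```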

[0035] Step 30: Acquire the action data of the determined action category, intercept the action data by time interval, analyze the relevant skeleton points involved in the intercepted data, generate the action pattern sequence of the corresponding skeleton points, and calculate the pattern distance between that sequence and the action pattern sequence of the predefined standard action sequence; judge whether the pattern distance is less than a threshold, and if so, add 1 to the count, otherwise the count remains...
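The pattern-sequence matching and counting of Step 30 can be illustrated with a toy encoding. The symbol alphabet ('u'/'d'/'s' for up/down/still), the mismatch-fraction distance, and the threshold value are all assumptions for illustration; the patent does not disclose its exact pattern encoding or distance metric here.

```python
def to_pattern(values, eps=0.01):
    """Encode a 1-D joint trajectory as a motion-pattern string:
    'u' moving up, 'd' moving down, 's' still (hypothetical encoding)."""
    out = []
    for a, b in zip(values, values[1:]):
        d = b - a
        out.append("u" if d > eps else "d" if d < -eps else "s")
    return "".join(out)

def pattern_distance(p, q):
    """Fraction of mismatched symbols between two patterns,
    counting any length difference as mismatches."""
    n = max(len(p), len(q))
    if n == 0:
        return 0.0
    mismatches = sum(a != b for a, b in zip(p, q)) + abs(len(p) - len(q))
    return mismatches / n

def count_repetitions(segments, standard, threshold=0.25):
    """Compare each intercepted segment's pattern against the predefined
    standard pattern; increment the count when the distance is under threshold."""
    count = 0
    standard_pattern = to_pattern(standard)
    for seg in segments:
        if pattern_distance(to_pattern(seg), standard_pattern) < threshold:
            count += 1
    return count
```

For example, a segment whose joint trajectory rises then falls like the standard sequence produces the same pattern string and is counted, while a motionless segment is not.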



Abstract

The invention provides a real-time human body action recognition and counting method comprising the following steps: acquire video data, obtain human skeleton point data, and generate time-series action data; perform fixed-point judgment on the action data according to the position information of the skeleton points in their temporal context, and determine the action category; intercept the action data by time interval, analyze the relevant skeleton points, generate the action pattern sequence of the corresponding skeleton points, and calculate the pattern distance between that sequence and the action pattern sequence of a predefined standard action sequence; judge whether the pattern distance is smaller than a threshold, and if so add 1 to the count, otherwise keep the count unchanged; return the action category and the counting result, and update the display on the user interface. The action category is judged through fixed-point recognition, and actions are counted by combining pattern setting with distance calculation, providing the user with real-time, accurate action recognition and counting.

Description

Technical field

[0001] The invention relates to the field of human motion recognition, in particular to a real-time human motion recognition and counting method.

Background technique

[0002] Existing human motion recognition technology provides a basic motion recognition framework that can locate the skeleton points of a single person or multiple people and estimate poses for human motion, facial expressions, finger movements, etc. Based on this framework, the posture of the human body can be described and the key skeleton points identified, but action recognition during human movement is still lacking, and existing human action recognition cannot achieve accurate counting in real time.

Contents of the invention

[0003] The technical problem to be solved by the present invention is to provide a real-time human motion recognition and counting method, which can realize real-time and a...


Application Information

IPC(8): G06K9/00; G06M3/08
CPC: G06M3/08; G06V40/23; G06V40/103; Y02D10/00
Inventor 吴志雄林立成高稳仁麦烤
Owner LINEWELL SOFTWARE