
A method of human action recognition

A human action recognition method, applied in the field of human action recognition, that addresses problems such as loss of action information, distortion of the action's temporal form, and reduced recognition accuracy.

Active Publication Date: 2021-02-19
SUZHOU UNIV

AI Technical Summary

Problems solved by technology

However, this method not only loses part of the action information but also distorts the temporal shape of the action: some stages of the action become longer and others shorter, which reduces the accuracy of action recognition.



Examples


Embodiment Construction

[0092] The following will clearly and completely describe the technical solutions in the embodiments of the present invention with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without making creative efforts belong to the protection scope of the present invention.

[0093] As shown in Figure 1, the human action recognition method proceeds as follows:

[0094] 1. Take the direction from the left shoulder joint to the right shoulder joint in the first frame as the positive direction of the x-axis, the direction from the hip joint to the cervical-spine joint as the positive direction of the y-axis, and the frontal orientation of the face as the positive direction of the z-axis, and normalize the c...
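The coordinate normalization in step 1 can be sketched as follows. This is a minimal NumPy sketch, not the patent's own implementation: the function name, the joint-index arguments, and the use of Gram-Schmidt orthogonalization to make the axes orthonormal are all assumptions, since the patent text here is truncated.

```python
import numpy as np

def normalize_skeleton(frames, l_sh, r_sh, hip, neck):
    """Re-express all joints in a body-centered coordinate system built
    from the FIRST frame (a sketch of step 1; joint indices are
    hypothetical).

    frames : (T, J, 3) array of 3D joint coordinates over T frames
    l_sh, r_sh, hip, neck : joint indices in the J axis
    """
    f0 = frames[0]
    x = f0[r_sh] - f0[l_sh]          # left shoulder -> right shoulder
    x = x / np.linalg.norm(x)
    y = f0[neck] - f0[hip]           # hip -> cervical-spine joint
    y = y - x * np.dot(x, y)         # Gram-Schmidt: make y orthogonal to x
    y = y / np.linalg.norm(y)
    z = np.cross(x, y)               # frontal direction (right-handed frame)
    R = np.stack([x, y, z])          # rows of R form the new basis
    origin = f0[hip]
    return (frames - origin) @ R.T   # translate to hip, rotate into basis
```

After this transform, every frame's coordinates are expressed relative to the first frame's body orientation, so the same action performed facing different directions yields comparable coordinates.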



Abstract

The present invention proposes an action recognition method comprising: a normalized coordinate system that standardizes the three-dimensional coordinates of the skeleton joint points in each frame of an action sample; a second-order self-circulating neural network that extracts the coordinate features of the skeleton joint points in each frame and yields a feature vector for each frame; selection of one training sample of each action type as that type's reference action sample; matching of each frame of a test action sample against each frame of every reference action sample to establish a temporal correspondence; and computation of the matching cost between the test action sample and each reference action sample, with the test sample assigned the action type of the reference sample having the smallest matching cost. The extracted feature dimension is small and the algorithm is efficient. Through temporal matching, the method adapts well to differences in duration and temporal form within the same action type and improves recognition accuracy.
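The matching-and-classification stage described above can be sketched with classic dynamic time warping (DTW) over per-frame feature vectors. This is a stand-in, not the patent's exact algorithm: the feature sequences are assumed to come from the paper's second-order self-circulating network (not implemented here), and the function names and the `label -> sequence` reference dictionary are hypothetical.

```python
import numpy as np

def dtw_cost(a, b):
    """Dynamic-time-warping matching cost between feature sequences
    a (Ta, D) and b (Tb, D): establishes a temporal correspondence
    between frames and sums the matched-frame distances."""
    Ta, Tb = len(a), len(b)
    D = np.full((Ta + 1, Tb + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, Ta + 1):
        for j in range(1, Tb + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])   # frame-to-frame cost
            D[i, j] = d + min(D[i - 1, j],            # skip a test frame
                              D[i, j - 1],            # skip a reference frame
                              D[i - 1, j - 1])        # match frames
    return D[Ta, Tb]

def classify(test_seq, references):
    """Assign the action type of the reference sample with the smallest
    matching cost. `references` maps action label -> feature sequence."""
    return min(references, key=lambda lbl: dtw_cost(test_seq, references[lbl]))
```

Because the warping path may repeat or skip frames, two performances of the same action at different speeds still align frame-to-frame, which is the adaptability to "time length and time form" differences that the abstract claims.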

Description

Technical field

[0001] The invention relates to a human action recognition method and belongs to the technical field of human action recognition.

Background technique

[0002] Machine vision is an important research direction in artificial intelligence. Action recognition based on machine vision has wide applications in human-computer interaction, virtual reality, video retrieval, and security monitoring. With the development of depth cameras and related technologies, depth images can be easily obtained and information about human skeleton joints extracted from them. Action recognition based on human skeleton joints has significant advantages over traditional recognition from two-dimensional images. Although much remarkable progress has been made in skeleton-based action recognition, it remains a very challenging task.

[0003] A complete action sequence is composed of the a...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K9/00, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/08, G06V40/20, G06N3/044, G06N3/045, G06F18/2411
Inventor: 杨剑宇, 赵晓枫, 朱晨
Owner: SUZHOU UNIV