
A monitoring system based on action recognition and a method thereof

A technology for motion recognition and monitoring, applied in character and pattern recognition, instruments, computer parts, etc.; it addresses problems such as the unsuitability of prior approaches for judging actions from pose sequences.

Pending Publication Date: 2019-04-09
Inventor: 李刚毅
Cites 3 · Cited by 12

AI Technical Summary

Problems solved by technology

The prior technology is not suited to distinguishing independent human skeleton models, nor to classifying and judging poses on skeleton models (rather than on raw video features), and is therefore unsuitable for judging actions from pose sequences.

Method used


Image

  • A monitoring system based on action recognition and a method thereof


Embodiment Construction

[0034] Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numerals in different drawings refer to the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatuses and methods consistent with aspects of the present disclosure as recited in the appended claims.

[0035] The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to limit the present disclosure. Unless otherwise defined, all other scientific and technical terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. As used in this disclosure...



Abstract

The invention relates to a monitoring system and method based on action recognition. The method comprises the steps of: recognizing the limb positions of a person in a monitored video frame through a pose estimation method and building a 2D model of the human skeleton; classifying the 2D human skeleton model in the monitored video frame with a pre-trained posture classification model; storing the pose classification results of consecutive video frames in a pose vector and judging the action type with a pre-trained action recognition model; and, if the judged action type belongs to a monitored type, storing the video frames marked as showing the specific action and/or the video clip of the action in a memory and triggering an alarm.
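The abstract's pipeline (pose estimation → skeleton pose classification → pose vector over consecutive frames → action recognition → alarm) can be sketched as below. This is a minimal illustration: the stage functions (`estimate_pose`, `classify_pose`, `classify_action`), the pose labels, and the monitored action names are all hypothetical stand-ins, not taken from the patent.

```python
from collections import deque

MONITORED_ACTIONS = {"climbing"}   # hypothetical set of actions that trigger an alarm

def estimate_pose(frame):
    """Stand-in for a 2D pose estimator: returns skeleton keypoints per frame."""
    return frame["keypoints"]          # e.g. a list of (x, y) joint coordinates

def classify_pose(skeleton):
    """Stand-in for the pre-trained posture classifier (skeleton -> pose label)."""
    return "arm_raised" if skeleton and skeleton[0][1] < 100 else "standing"

def classify_action(pose_vector):
    """Stand-in for the action-recognition model over a pose sequence."""
    raised = sum(1 for p in pose_vector if p == "arm_raised")
    return "climbing" if raised >= 3 else "idle"

def monitor(frames, window=5):
    """Slide a pose-vector window over the frames; return (frame index, action)
    pairs for every monitored action that was detected."""
    pose_vector = deque(maxlen=window)   # pose classifications of consecutive frames
    alarms = []
    for i, frame in enumerate(frames):
        skeleton = estimate_pose(frame)
        pose_vector.append(classify_pose(skeleton))
        if len(pose_vector) == window:
            action = classify_action(pose_vector)
            if action in MONITORED_ACTIONS:
                alarms.append((i, action))   # a real system would also save the clip
    return alarms
```

The `deque(maxlen=window)` models the patent's pose vector over consecutive video frames: each new frame's pose classification pushes out the oldest one, so the action model always judges the most recent pose sequence.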

Description

technical field

[0001] The present disclosure relates to a monitoring system and method based on motion recognition, and in particular to a system and method that use pose prediction, pose recognition, and motion recognition technology to determine whether a person in a monitored video has made a specific motion and, if a specific motion is detected, automatically raise an alarm and save the related video frames and video file fragments for future reference.

background technique

[0002] The actions of an object are decisive in judging its behavior. Whether the object is a human, an animal, or a machine, reaching a set goal requires carrying out the corresponding actions.

[0003] Chinese invention patent application publication CN107992858A proposes a real-time 3D gesture estimation system and method based on a single RGB frame, which uses a hand detector to detect and frame the hand area,...
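The "human skeleton 2D modeling" step implies turning raw keypoints into a representation the posture classifier can consume. A common choice, sketched below as an assumption rather than the patent's actual scheme, is to normalize the joints into the unit square so the classifier is invariant to where the person stands in the frame and to their apparent scale:

```python
def skeleton_features(keypoints):
    """Normalize a list of (x, y) joint coordinates into [0, 1] x [0, 1]
    and flatten them into a fixed-length feature vector.

    Hypothetical preprocessing for the posture classifier; the normalization
    scheme is an illustrative assumption, not taken from the patent text."""
    xs = [x for x, _ in keypoints]
    ys = [y for _, y in keypoints]
    min_x, min_y = min(xs), min(ys)
    w = max(max(xs) - min_x, 1e-6)   # guard against zero-width skeletons
    h = max(max(ys) - min_y, 1e-6)
    feats = []
    for x, y in keypoints:
        feats.append((x - min_x) / w)
        feats.append((y - min_y) / h)
    return feats
```

With this normalization, the same pose produces (nearly) the same feature vector whether the person is near the camera or far from it, which is what lets a single pre-trained classifier label poses anywhere in the monitored frame.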

Claims


Application Information

Patent Timeline
Patent Type & Authority: Application (China)
IPC (8): G06K 9/00; G06K 9/32
CPC: G06V 40/20; G06V 10/25
Inventor 李刚毅
Owner 李刚毅