
Self-supervised learning and skeleton information behavior identification method

A technology in the field of self-supervised learning and recognition methods, applied to character and pattern recognition, instruments, and computer components. It addresses the problems of limited data, complex information, and reduced recognition accuracy, with the effects of enhancing robustness, overcoming illumination changes, and remedying performance deficiencies.

Active Publication Date: 2021-04-16
SUN YAT SEN UNIV

AI Technical Summary

Problems solved by technology

[0005] However, for human action recognition based on human skeleton video information, the main difficulty is how to effectively extract the temporal dependency information between different video frames. Compared with RGB images, human skeleton data contains less data but more complex information, and human behavior is closely tied to the action process across the entire video; if the information implicit in the temporal sequence cannot be exploited effectively, the recognition accuracy will suffer.

Embodiment Construction

[0022] The following will clearly and completely describe the technical solutions in the embodiments of the present invention with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative efforts fall within the protection scope of the present invention.

[0023] It should be understood that the step numbers used herein are only for convenience of description, and are not intended to limit the execution order of the steps.

[0024] It should be understood that the terminology used in the description of the present invention is for the purpose of describing particular embodiments only and is not intended to limit the present invention. As used in this specification and the appended claims, the singular forms "a", "an"...

Abstract

The invention discloses a behavior recognition method based on self-supervised learning and skeleton information, and relates to the technical field of computer vision. The method comprises the following steps. S1: construct a configurable deep model. S2: in the network pre-training stage, obtain pre-training samples according to a preset optical-flow prediction task, where each pre-training sample comprises a skeleton video and a machine-generated label for the optical-flow prediction task; train the transformation network with the pre-training samples to obtain the initial parameters θ′ of the transformation network. S3: in the network fine-tuning stage, initialize the transformation network with the initial parameters θ′, and construct a fine-tuning deep model by combining the initialized transformation network with a randomly initialized fine-tuning classification network. S4: input the skeleton video to be recognized into the trained fine-tuning deep model, and the fine-tuning classification network outputs the classification prediction result. On the premise of ensuring high precision, the invention achieves human behavior recognition with better effectiveness, robustness, and generalization.
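
The abstract outlines a two-stage pipeline: self-supervised pre-training on an optical-flow prediction pretext task, then fine-tuning with a randomly initialized classification head. The sketch below shows one way steps S1-S4 could be wired together in PyTorch. It is only a minimal illustration under assumptions the patent does not confirm: the transformation network is modeled here as a Transformer encoder over per-frame skeleton tokens, the optical-flow label is treated as a 2-D regression target per frame, and names such as SkeletonTransformer and flow_head, the layer sizes, and the 60-class output are all hypothetical.

```python
# Minimal sketch of the S1-S4 pipeline; module names, sizes, and label shapes are
# assumptions for illustration, not taken from the patent text.
import torch
import torch.nn as nn

class SkeletonTransformer(nn.Module):
    """S1: a configurable encoder over per-frame skeleton features."""
    def __init__(self, num_joints=25, coord_dim=3, d_model=256, n_heads=8, n_layers=4):
        super().__init__()
        self.embed = nn.Linear(num_joints * coord_dim, d_model)   # one token per frame
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)

    def forward(self, skel):                  # skel: (batch, frames, joints, coords)
        b, t, j, c = skel.shape
        tokens = self.embed(skel.reshape(b, t, j * c))
        return self.encoder(tokens)           # (batch, frames, d_model)

# --- S2: self-supervised pre-training on the optical-flow prediction task ---
encoder = SkeletonTransformer()
flow_head = nn.Linear(256, 2)                 # assumed 2-D flow target per frame
pretrain_opt = torch.optim.Adam(list(encoder.parameters()) + list(flow_head.parameters()))

def pretrain_step(skel_batch, flow_labels):
    """flow_labels are produced automatically from the video, with no manual annotation."""
    feats = encoder(skel_batch)                               # (batch, frames, d_model)
    loss = nn.functional.mse_loss(flow_head(feats), flow_labels)
    pretrain_opt.zero_grad()
    loss.backward()
    pretrain_opt.step()
    return loss.item()

# After pre-training, the encoder weights play the role of the initial parameters theta'.
theta_prime = encoder.state_dict()

# --- S3: fine-tuning model = theta'-initialized encoder + random classifier ---
finetune_encoder = SkeletonTransformer()
finetune_encoder.load_state_dict(theta_prime)                 # initialize with theta'
classifier = nn.Linear(256, 60)                               # 60 action classes is an assumption

# --- S4: inference on a skeleton video to be recognized ---
def predict(skel_video):
    feats = finetune_encoder(skel_video).mean(dim=1)          # temporal average pooling (assumption)
    return classifier(feats).argmax(dim=-1)
```

In this sketch, the S3 fine-tuning loop (cross-entropy training on labeled skeleton videos, updating both finetune_encoder and classifier) would run before predict is called on new videos.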

Description

Technical Field

[0001] The invention relates to the technical field of computer vision, and in particular to a behavior recognition method based on self-supervised learning and skeleton information.

Background Technique

[0002] Human behavior recognition is an important and active basic research topic in the field of computer vision. The technology analyzes and classifies images or videos containing human actions in order to predict what is happening in the scene. With the development of video acquisition sensors and video surveillance, human behavior recognition has gradually become a research area with wide application in intelligent monitoring, human-computer interaction, intelligent robots, and other scenarios, and has attracted more and more researchers' attention. At present, human behavior recognition is mainly aimed at the understanding of video data.

[0003] Current research on human behavior recognition is mainly divided into two categories, namely recogni...

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06K9/00G06K9/62
CPCY04S10/50
Inventor 张冬雨成奕彬林倞
Owner SUN YAT SEN UNIV