
Network training method and device, action recognition method and device, equipment and storage medium

A network-training and human-action-recognition technology, applied in character and pattern recognition, biological neural network models, instruments, etc., which addresses problems such as noise in estimated skeleton-point sequences interfering with action recognition

Active Publication Date: 2020-12-11
TENCENT TECH (SHENZHEN) CO LTD
Cites 11 · Cited by 13

AI Technical Summary

Problems solved by technology

In related technologies, because the visibility of human skeleton key points is strongly affected by body posture and environmental factors, the skeleton-point sequences estimated by sensors contain a certain amount of noise, which interferes with subsequent action recognition.



Examples


Detailed Description of the Embodiments

[0062] To make the purpose, technical solutions, and advantages of this application clearer, the application is described in further detail below with reference to the accompanying drawings. All other embodiments obtained by those of ordinary skill in the art without creative effort fall within the scope of protection of this application.

[0063] In the following description, references to "some embodiments" describe a subset of all possible embodiments; it should be understood that "some embodiments" may refer to the same subset or to different subsets of all possible embodiments, and that these subsets can be combined with one another provided no conflict arises.

[0064] In the following description, the terms "first/second/third" are used only to distinguish similar objects and do not denote a particular ordering of objects. It should be understood that, where permitted, the specific order or sequence may be interchanged so that the embodiments of the application described herein can be practiced in sequences other than those illustrated or described here.



Abstract

The invention provides a network training method and device, an action recognition method and device, equipment, and a computer-readable storage medium. The method comprises the following steps: updating the model parameters of a pre-training model using a first sequence data set of human skeleton point sequences and the viewing-angle label corresponding to each piece of first sequence data in the first sequence data set; initializing the model parameters of a human action recognition model from the updated model parameters of the pre-training model, where the pre-training model and the human action recognition model have feature extraction networks of the same structure; and updating the model parameters of the human action recognition model using a second sequence data set of human skeleton point sequences and the action-category label corresponding to each piece of second sequence data in the second sequence data set, to obtain a trained human action recognition model. With this method and device, the action recognition accuracy of the human action recognition model can be improved, model training time can be shortened, and dependence on strongly annotated data can be reduced, thereby reducing manual annotation workload.
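The two-stage scheme the abstract describes can be sketched in outline: pretrain a model on viewing-angle labels, transfer its feature extractor into the action-recognition model (both share the same extractor structure), then fine-tune on action-category labels. The sketch below is illustrative only; all class and function names are hypothetical stand-ins, not from the patent, and the arithmetic "updates" are placeholders for real gradient steps.

```python
# Hypothetical sketch of the two-stage training scheme: pretrain on
# viewing-angle labels, transfer the feature extractor, fine-tune on
# action-category labels. Names and update rules are illustrative.

import copy

class FeatureExtractor:
    """Stand-in for the shared-structure feature extraction network."""
    def __init__(self):
        self.weights = [0.0] * 4  # placeholder parameters

class PretrainModel:
    """Predicts the viewing angle of a skeleton-point sequence."""
    def __init__(self):
        self.backbone = FeatureExtractor()
        self.view_head = [0.0] * 3  # viewing-angle classification head

    def update(self, sequences, view_labels):
        # Placeholder for one gradient step on the view-angle task.
        self.backbone.weights = [w + 0.1 for w in self.backbone.weights]

class ActionModel:
    """Predicts the action category of a skeleton-point sequence."""
    def __init__(self):
        self.backbone = FeatureExtractor()
        self.action_head = [0.0] * 5  # action-category classification head

    def init_from(self, pretrained: PretrainModel):
        # Both networks have identically structured feature extractors,
        # so the pretrained parameters transfer directly.
        self.backbone = copy.deepcopy(pretrained.backbone)

    def update(self, sequences, action_labels):
        # Placeholder for one fine-tuning step on the action task.
        self.backbone.weights = [w + 0.01 for w in self.backbone.weights]

# Stage 1: pretrain on the first data set with viewing-angle labels,
# which are typically cheaper to obtain than action annotations.
pre = PretrainModel()
pre.update(sequences=["seq1", "seq2"], view_labels=[0, 1])

# Stage 2: initialize from the pretrained extractor, then fine-tune on
# the second data set with action-category labels.
action = ActionModel()
action.init_from(pre)
action.update(sequences=["seq3"], action_labels=[2])
```

The transfer step is the key design point: because only the feature extractor is shared, the viewing-angle head is discarded after pretraining and a fresh action-category head is trained during fine-tuning.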

Description

Technical Field

[0001] The present application relates to computer vision technology, and in particular to a network training and action recognition method, apparatus, device, and storage medium.

Background

[0002] With the research and progress of artificial intelligence technology, AI has been studied and applied in many fields, such as smart homes, smart wearable devices, virtual assistants, smart speakers, smart marketing, unmanned and automatic driving, drones, robots, smart medical care, and smart customer service. Among these applications, human action recognition technology plays an increasingly important role, for example in detecting whether a target person has fallen or is ill, automated teaching of fitness, sports, and dance, understanding body language (such as airport runway signals and traffic police signals), and enhanced security and monitoring. ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/62; G06N3/04
CPC: G06V40/23; G06N3/045; G06F18/2321; G06F18/2415; G06F18/214
Inventors: 徐飞翔, 黄迎松, 白琨
Owner TENCENT TECH (SHENZHEN) CO LTD