
Equipment-free personnel action recognition and position estimation method based on multi-task learning

A multi-task learning and action recognition technology, applied in the field of equipment-free personnel action recognition and position estimation, which solves the problem of increased positioning error and achieves the effects of improved estimation performance, practicability and convenience

Pending Publication Date: 2021-12-21
NANJING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

Transforming the position estimation problem into classification learning leads to increased localization error when the target lies between the reference points.
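To illustrate the error the patent attributes to classification-based localization, here is a minimal sketch. The reference-point grid and target position are hypothetical, chosen only to show the effect: snapping a target to the nearest reference-point label incurs an irreducible quantization error that peaks midway between points.

```python
# Sketch: classification-based localization snaps a target to the nearest
# reference point, so a target midway between points incurs an irreducible
# quantization error. Grid spacing and positions are illustrative only.
ref_points = [0.0, 1.0, 2.0, 3.0]  # 1-D reference points, 1 m apart

def classify_position(x):
    """Return the nearest reference point (classification over point labels)."""
    return min(ref_points, key=lambda p: abs(p - x))

target = 1.5  # target exactly between two reference points
err = abs(classify_position(target) - target)
print(err)  # worst-case error is half the grid spacing: 0.5
```

A regression-style (or multi-label X/Y) formulation, as pursued in this patent, can output positions between reference points and so avoids this floor on accuracy.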

Method used




Embodiment Construction

[0040] Objects, advantages and features of the present invention will be illustrated and explained by the following non-limiting description of preferred embodiments. These embodiments are only typical examples of applying the technical solutions of the present invention, and all technical solutions formed by adopting equivalent replacements or equivalent transformations fall within the protection scope of the present invention.

[0041] The present invention discloses a multi-task learning-based method for action recognition and position estimation of equipment-free personnel. Aiming at the defects of the prior art, the proposed method achieves both high positioning accuracy and a high action recognition rate, with a simple structure and low implementation cost.

[0042] A non-device human action recognition and position estimation method bas...



Abstract

The invention discloses an equipment-free personnel action recognition and position estimation method based on multi-task learning. The method comprises an offline stage and an online stage. In the offline stage, an action label, an X-axis label and a Y-axis label are added to each CSI image to form a training set; the training set is then fed into a multi-task network for training, and the model is stored. In the online stage, a newly obtained CSI image is fed into the trained multi-task network model, which carries out action recognition and position estimation. The method uses a multi-task deep neural network with a hard parameter sharing mechanism: in the backbone network, parameters are shared across tasks, emphasizing the correlation and mutual benefit between tasks while ignoring task specificity; in the branch networks, the structures are mutually independent, preserving the specificity of each task.
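The hard-parameter-sharing architecture described in the abstract can be sketched as follows. This is an assumed, minimal NumPy illustration, not the patent's actual network: the layer sizes, class counts and single shared layer are placeholders. The key structure is the one the abstract names: one backbone whose parameters are shared by all tasks, and three mutually independent branch heads for the action, X-axis and Y-axis labels.

```python
import numpy as np

# Sketch of hard parameter sharing (assumed architecture; layer sizes and
# class counts are illustrative, not taken from the patent). A shared
# backbone extracts features from a flattened CSI image; three independent
# branch heads predict the action class, the X-axis label and the Y-axis label.
rng = np.random.default_rng(0)

D_IN, D_SHARED = 64, 32        # flattened CSI input size, shared feature size
N_ACT, N_X, N_Y = 6, 10, 10    # number of labels per task (illustrative)

W_shared = rng.standard_normal((D_IN, D_SHARED)) * 0.1  # shared by all tasks
heads = {  # task-specific branch weights, mutually independent
    "action": rng.standard_normal((D_SHARED, N_ACT)) * 0.1,
    "x": rng.standard_normal((D_SHARED, N_X)) * 0.1,
    "y": rng.standard_normal((D_SHARED, N_Y)) * 0.1,
}

def forward(csi_batch):
    """One forward pass: shared backbone, then per-task branch heads."""
    h = np.maximum(csi_batch @ W_shared, 0.0)  # ReLU backbone features
    return {name: h @ W for name, W in heads.items()}

out = forward(rng.standard_normal((4, D_IN)))  # batch of 4 CSI images
print({name: logits.shape for name, logits in out.items()})
```

In training, the three per-task losses would be summed (possibly with weights) and backpropagated, so gradients from all three tasks update `W_shared` while each head receives only its own task's gradient, which is what makes the backbone capture inter-task correlation while the branches retain task specificity.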

Description

Technical Field

[0001] The invention relates to a multi-task learning-based equipment-free personnel action recognition and position estimation method, which can be used in the technical field of positioning and navigation.

Background Technique

[0002] Target state estimation mainly involves recognizing a target's actions and posture changes. Many approaches exist for detecting and perceiving these changes, such as wireless signals, wearable smart devices, and image or video signals. Most methods requiring wearable devices use integrated devices with built-in wireless sensors; although such devices are relatively cheap, they are inconvenient to wear in some situations. Methods relying on images and videos suffer from blind spots in coverage and are easily affected by lighting conditions, which degrades the accuracy of the final judgment. In addition, the methods of images and videos are also like...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/08, G06N3/045, G06F18/214, G06F18/241, Y02D30/70
Inventor: 颜俊, 万凌鹏, 曹艳华
Owner: NANJING UNIV OF POSTS & TELECOMM