
Cross-scene human body action recognition method based on adversarial meta-learning

A human action recognition technology applied in the fields of wireless networks, human action recognition, and deep learning. It addresses the problems of large fluctuations in CSI action signals, the heavy repetitive workload of retraining an action recognition system for each scene, and the high manpower and material cost of calibration, and achieves the effects of better classification, stronger identification ability, and improved accuracy.

Active Publication Date: 2021-11-12
HEFEI UNIV OF TECH

AI Technical Summary

Problems solved by technology

However, this approach is impractical, because the calibration process costs a great deal of manpower and material resources.
Some schemes deploy fixed hardware to collect new CSI for calibration, but the additional hardware incurs extra cost.
A further challenge is that when a human action recognition system trained in a known environment is deployed in a new scene to recognize the actions of a new user, the recognition accuracy drops significantly.
This is because people usually repeat the same action at different speeds in different environments, so the CSI signal for the same action fluctuates too much for the system to recognize it.
The system must therefore be retrained for each new scenario, which imposes a considerable amount of repetitive work on action recognition.

Method used



Examples

Experimental program
Comparison scheme
Effect test

Embodiment Construction

[0042] In this embodiment, referring to figure 1, a cross-scene human action recognition method based on adversarial meta-learning proceeds as follows:

[0043] Step 1. Select two rooms with different indoor layouts as scene 1 and scene 2, and deploy a pair of WIFI transceiver devices in each:

[0044] The WIFI signal sending device in each transceiver pair is a router with a antennas, denoted AP; the WIFI signal receiving device is a wireless network card with b antennas, denoted RP. The distance between the router AP and the wireless network card RP is l, so that scenario 1 and scenario 2 each form a×b antenna pairs for sending and receiving radio signals, and each antenna pair has z available subcarriers;
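Under this setup, one CSI recording can be viewed as a complex-valued tensor indexed by packet, antenna pair, and subcarrier. A minimal sketch of that layout follows; the concrete values a=2, b=3, z=30 and the packet count are hypothetical placeholders, not figures taken from the patent:

```python
import numpy as np

# Hypothetical deployment parameters (placeholders, not from the patent text)
a, b, z = 2, 3, 30   # AP antennas, RP antennas, usable subcarriers per pair
T = 500              # number of received packets in one recording

# CSI is complex-valued: each packet yields one measurement per
# (antenna pair, subcarrier) combination, giving a (T, a*b, z) tensor
rng = np.random.default_rng(0)
csi = rng.standard_normal((T, a * b, z)) + 1j * rng.standard_normal((T, a * b, z))

# Action-recognition pipelines typically operate on the amplitude stream
amplitude = np.abs(csi)
print(amplitude.shape)  # (500, 6, 30)
```

Each of the a×b antenna pairs contributes z subcarrier time series, which is why CSI-based methods have per-packet feature dimension a·b·z.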

[0045] In this example, as shown in figure 2 and figure 3, WIFI devices are deployed at fixed positions 2.6 meters apart in the rooms of Scenario 1 and Scenario 2: TL-WDR6500 router with 2...



Abstract

The invention discloses a cross-scene human body action recognition method based on adversarial meta-learning, comprising the following steps: first, acquire human action CSI (Channel State Information) signals with wireless transceiver equipment, process the raw signals with discrete wavelet filtering, obtain CSI samples with a threshold segmentation method, and construct a meta-learning task set; next, use the task set to train a feature extractor module, a generator module, a discriminator module, and a human action recognition module in sequence, obtaining a base adversarial meta-model; finally, collect a small amount of data in the new scene to fine-tune the model parameters so that the model adapts to recognizing new action types. The invention achieves high recognition accuracy on new action types at low cost when the scene, user, or other conditions change.
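The pre-processing named in the abstract (wavelet filtering of the raw signal, then threshold segmentation into CSI samples) can be sketched roughly as below. This is a generic illustration rather than the patent's exact algorithm: it uses a one-level Haar wavelet with soft thresholding for denoising and a simple deviation-from-mean activity threshold for segmentation, and all thresholds and window lengths are hypothetical.

```python
import numpy as np

def haar_dwt_denoise(x, thr):
    """One-level Haar DWT denoising: decompose, soft-threshold details, reconstruct."""
    x = np.asarray(x, dtype=float)
    n = len(x) - len(x) % 2  # work on an even-length prefix
    a = (x[:n:2] + x[1:n:2]) / np.sqrt(2)  # approximation coefficients
    d = (x[:n:2] - x[1:n:2]) / np.sqrt(2)  # detail coefficients
    # Soft-threshold the details: small (noise-like) coefficients go to zero
    d = np.sign(d) * np.maximum(np.abs(d) - thr, 0.0)
    # Inverse Haar transform
    y = np.empty(n)
    y[0::2] = (a + d) / np.sqrt(2)
    y[1::2] = (a - d) / np.sqrt(2)
    return y

def segment_actions(amplitude, power_thr, min_len=50):
    """Threshold segmentation: return (start, end) index pairs of contiguous runs
    whose deviation from the mean exceeds power_thr for at least min_len samples."""
    dev = np.abs(amplitude - amplitude.mean())
    active = dev > power_thr
    segments, start = [], None
    for i, flag in enumerate(active):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_len:
                segments.append((start, i))
            start = None
    if start is not None and len(active) - start >= min_len:
        segments.append((start, len(active)))
    return segments

# Toy example: a flat signal containing one high-amplitude "action" burst
sig = np.concatenate([np.zeros(100), np.full(80, 5.0), np.zeros(100)])
clean = haar_dwt_denoise(sig, thr=0.1)
print(segment_actions(sig, power_thr=2.0))  # [(100, 180)]
```

Each segment returned would then be cut out of the CSI stream as one action sample and grouped into meta-learning tasks; with threshold 0 the Haar transform reconstructs the input exactly, which is a handy sanity check.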

Description

technical field [0001] The invention relates to the fields of wireless networks, human action recognition, and deep learning, and in particular to a few-sample adversarial meta-learning method. Background technique [0002] In recent years, with the development of Internet of Things technology, human motion recognition systems based on radio technology have been widely used in smart homes, health monitoring, and other fields, attracting growing attention. Existing methods can be divided into those based on WIFI, ultra-wideband (UWB), radio-frequency identification (RFID) tags, and so on. Among them, UWB technology requires deploying a large number of devices to receive and process signals, as well as modifying the site, resulting in high costs; human body recognition based on RFID tags, mobile phones, or wearable devices requires the target to carry equipment. At present, the WIFI-based human motion recognition system is becoming a po...

Claims


Application Information

Patent Timeline
Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/62; G06N3/04; G06N3/08; H04W4/33; H04W84/12; H04W88/14
CPC: G06N3/08; H04W4/33; H04W84/12; H04W88/14; G06N3/047; G06N3/048; G06N3/045; G06F2218/06; G06F2218/08; G06F2218/12; G06F18/2415; G06F18/214; Y02D30/70
Inventor: 王昱洁, 姚路, 王英
Owner: HEFEI UNIV OF TECH