Multi-scene indoor action recognition method based on channel state information and BiLSTM

A channel state information and action recognition technology, applied in the field of human activity recognition, which solves the problems that existing models cannot be used across multiple scenes and have low recognition accuracy, and achieves the effects of reducing cost and time, improving accuracy and simplifying deployment.

Pending Publication Date: 2020-08-18
NANJING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

[0005] The purpose of the present invention is to provide a multi-scenario indoor action recognition method based on channel state information and BiLSTM.



Examples


Example Embodiment

[0036] The technical solution of the present invention will be further described in detail below in conjunction with the accompanying drawings of the specification.

[0037] A multi-scene indoor action recognition method based on channel state information and BiLSTM, which specifically includes the following steps:

[0038] S1: Receive Wi-Fi signals while indoor movements are performed in multiple indoor scenes, and extract channel state information (CSI) data from the Wi-Fi signals.

[0039] Place the Wi-Fi transmitter and receiver in the room where action recognition is required, connect the receiver to the PC used for collection, and install the corresponding collection software. Turn on the transmitter, the receiver and the collection PC, and use the collection software to receive Wi-Fi signals and extract channel state information (CSI) data. While collecting, tag the CSI data to be used as the training set with the corresponding action labels. For example, using 3 transmitting antennas and 3 ...
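As an illustration of this collection and labeling step, the short Python sketch below shows one way the raw complex CSI readings from a capture could be converted to amplitude features and paired with an action label. The antenna counts, subcarrier count, packet rate and action names are assumptions for the example only and are not taken from the patent.

# Illustrative sketch only: organizing and labeling collected CSI samples.
# The 3x3 antenna setup, 30 subcarriers, 1 kHz packet rate and the action
# names are assumptions, not values from the patent.
import numpy as np

N_TX, N_RX, N_SUBCARRIERS = 3, 3, 30          # assumed antenna/subcarrier setup
ACTIONS = ["walk", "sit", "stand", "fall"]    # hypothetical action labels

def label_sample(csi_complex, action):
    """Convert one raw CSI capture to an amplitude matrix plus its label index.

    csi_complex: (time_steps, N_TX, N_RX, N_SUBCARRIERS) complex CSI readings.
    Returns (amplitude, label) where amplitude is (time_steps, N_TX*N_RX*N_SUBCARRIERS).
    """
    amplitude = np.abs(csi_complex).reshape(csi_complex.shape[0], -1)
    return amplitude, ACTIONS.index(action)

# Example: a fake 2-second capture at the assumed 1 kHz packet rate.
fake_csi = np.random.randn(2000, N_TX, N_RX, N_SUBCARRIERS) \
         + 1j * np.random.randn(2000, N_TX, N_RX, N_SUBCARRIERS)
sample, label = label_sample(fake_csi, "walk")   # sample: (2000, 270), label: 0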



Abstract

The invention relates to a multi-scene indoor action recognition method based on channel state information (CSI) and BiLSTM. The method comprises the steps of: collecting Wi-Fi signals while indoor movements are performed in a plurality of indoor scenes, and extracting the channel state information (CSI) data; preprocessing the CSI data, including low-pass filtering and normalization, and dividing the data of each scene into a training set and a test set; inputting the original-scene training set into a BiLSTM-based deep neural network to train an original-scene action recognition model; classifying the collected original-scene CSI test set data with the learned original-scene model to perform indoor action recognition; and obtaining an indoor action recognition model suitable for a new scene through a transfer learning mechanism, and performing action recognition in the new scene. The method extracts the temporal features in the CSI with a BiLSTM-based deep neural network to carry out action recognition, achieves multi-scene adaptation through transfer learning, and offers good accuracy, generality and robustness.
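The Python sketch below (using SciPy and PyTorch) illustrates, under stated assumptions, the kind of pipeline the abstract describes: low-pass filtering and normalization of the CSI data, a BiLSTM network with a fully connected classification layer, and a simple transfer-learning step that adapts the trained model to a new scene by fine-tuning only the classifier head. The layer sizes, sampling rate, cutoff frequency and number of action classes are illustrative and not taken from the patent.

# Minimal sketch, not the patent's exact implementation.
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import butter, filtfilt

def preprocess(csi_amplitude, fs=1000.0, cutoff=50.0, order=4):
    """Low-pass filter each CSI stream, then normalize the sample.

    csi_amplitude: (time_steps, features) amplitude matrix for one action sample.
    fs and cutoff are assumed sampling and cutoff frequencies.
    """
    b, a = butter(order, cutoff / (fs / 2.0), btype="low")
    filtered = filtfilt(b, a, csi_amplitude, axis=0)
    # Zero-mean, unit-variance normalization over the whole sample.
    return (filtered - filtered.mean()) / (filtered.std() + 1e-8)

class BiLSTMClassifier(nn.Module):
    """BiLSTM followed by a fully connected classification layer."""
    def __init__(self, n_features=270, hidden=128, n_layers=2, n_actions=6):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, n_layers,
                            batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden, n_actions)

    def forward(self, x):                 # x: (batch, time_steps, n_features)
        out, _ = self.lstm(x)             # (batch, time_steps, 2*hidden)
        return self.fc(out[:, -1, :])     # classify from the last time step

# Original-scene training would fit BiLSTMClassifier on the original training set.
model = BiLSTMClassifier()
x = torch.randn(8, 2000, 270)            # fake batch: 8 samples, 2000 time steps
logits = model(x)                        # (8, n_actions) class scores

# One possible transfer-learning step for a new scene: freeze the BiLSTM so the
# learned temporal features are kept, and fine-tune only a fresh classifier head.
for p in model.lstm.parameters():
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 6)     # new-scene classifier head
optimizer = torch.optim.Adam(
    filter(lambda p: p.requires_grad, model.parameters()), lr=1e-3)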

Description

technical field
[0001] The invention relates to the technical field of human activity recognition, and in particular to a multi-scene indoor action recognition method based on channel state information and BiLSTM.
Background technique
[0002] In human-computer interaction, detecting and recognizing human activities and behaviors is particularly important for making it easier for the computer to understand people's intentions. However, traditional human activity recognition technologies, including wearable-device-based and vision-based recognition methods, have limitations and drawbacks: they are invasive to a certain extent and incur high costs.
[0003] Since Wi-Fi devices are widely distributed, low-cost and easy to deploy, only terminal devices with wireless network cards, such as notebooks, are required, and the detection and identification of human activities using Wi-Fi signals is passive. There is no need ...

Claims


Application Information

IPC (IPC8): H04W4/33; H04L1/00; G06N3/08; G06N3/04
CPC: H04W4/33; H04L1/0026; G06N3/049; G06N3/08; G06N3/045
Inventor: 肖甫, 方垣闰, 盛碧云, 周剑, 周颖, 陈霄
Owner: NANJING UNIV OF POSTS & TELECOMM