
Wi-Move behavior perception method based on CNN (Convolutional Neural Network)

A behavior-perception and algorithmic technology applied in the computer field; a search found no related prior patent publications.

Active Publication Date: 2020-12-04
TIANJIN CHENGJIAN UNIV


Problems solved by technology

[0009] A search found no patent publications related to the present patent application.

Method used



Examples


Embodiment Construction

[0139] The present invention is described in further detail below in conjunction with specific embodiments. The following embodiments are descriptive only, not limiting, and do not restrict the protection scope of the present invention.

[0140] Structures not described in detail herein may be understood as conventional structures in the art.

[0141] The present invention proposes a CNN-based Wi-Move behavior perception method that recognizes 9 kinds of behaviors. The specific preparation and detection are as follows:

[0142] 1.1 Wi-Move data preprocessing

[0143] The perception method uses only the amplitude information of CSI. Under the influence of human behavior, the phase information does not exhibit the regular fluctuation pattern that the amplitude information does, so manually extracted phase features are not well separable. Wi-Move, however, uses a deep network to extract feature information, so W...
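The amplitude-extraction step described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name, array shapes, and the (packets, antennas, subcarriers) layout are assumptions chosen to match common CSI tooling (e.g. the Intel 5300 tool reports 30 subcarriers per antenna).

```python
import numpy as np

def csi_amplitude(csi: np.ndarray) -> np.ndarray:
    """Return per-subcarrier amplitude |H| from complex CSI measurements.

    csi: complex array of shape (packets, antennas, subcarriers).
    Only the amplitude is kept, since phase fluctuates less regularly
    under human motion and separates behaviors poorly.
    """
    return np.abs(csi)

# Illustrative data: 100 packets, 3 RX antennas, 30 subcarriers.
rng = np.random.default_rng(0)
csi = rng.standard_normal((100, 3, 30)) + 1j * rng.standard_normal((100, 3, 30))
amp = csi_amplitude(csi)
print(amp.shape)  # (100, 3, 30)
```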



Abstract

The invention relates to a CNN-based Wi-Move behavior perception method comprising the following steps: (1) preprocessing Wi-Move data; (2) performing human-body behavior perception based on a CNN; (3) constructing a Wi-Move input feature map; (4) designing the Wi-Move network; and (5) optimizing the Wi-Move network model. Feature-extraction-and-classification sensing methods extract features incompletely and are suitable only for sensing a small number of behavior types. Addressing this problem, the invention provides a CNN-based Wi-Move behavior sensing method that achieves higher recognition accuracy when sensing many types of behaviors. The amplitude and phase information of all CSI subcarriers is extracted hierarchically by a deep CNN, making the feature information more comprehensive.
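Step (3) above, constructing the input feature map, can be sketched as arranging subcarrier amplitudes over a sliding time window into a 2D "image" (time × subcarrier) that a CNN can consume. The window length, stride, and flattened 90-subcarrier layout (3 antennas × 30 subcarriers) below are illustrative assumptions, not values stated in the patent.

```python
import numpy as np

def build_feature_maps(amplitude: np.ndarray, window: int, stride: int) -> np.ndarray:
    """Slice an amplitude stream of shape (packets, subcarriers) into
    overlapping 2D feature maps of shape (n_windows, window, subcarriers)."""
    packets, _ = amplitude.shape
    starts = range(0, packets - window + 1, stride)
    return np.stack([amplitude[s:s + window] for s in starts])

# Illustrative amplitude stream: 200 packets, 90 subcarrier channels.
amp = np.abs(np.random.default_rng(1).standard_normal((200, 90)))
maps = build_feature_maps(amp, window=50, stride=25)
print(maps.shape)  # (7, 50, 90)
```

Each resulting map plays the role an image plays in ordinary CNN classification: convolutional layers can then extract local time-frequency patterns characteristic of each behavior.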

Description

Technical field

[0001] The invention belongs to the technical field of computers, and in particular to a CNN-based Wi-Move behavior perception method.

Background technique

[0002] With the rapid development of modern science and technology and the widespread adoption of computing devices, Human-Computer Interaction (HCI) has become a focus of exploration for many researchers. Human-computer interaction technology refers to the process by which users exchange data with computing devices through preset interaction modes such as expressions, voice, and behaviors, enabling the devices to complete specified tasks. As an important research area, human behavior perception plays a vital role in human-computer interaction technology and has brought great improvements to people's work and lifestyles.

[0003] From a broad perspective, human behavior perception technology mainly involves three fields, namely human behavior perception based on ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62; G06N3/04; G06N3/08; G06K9/00; H04B17/309
CPC: G06N3/08; H04B17/309; G06N3/045; G06F2218/06; G06F2218/08; G06F18/2415
Inventors: 王燕, 闫博, 张锐, 郭洪飞, 胡斌, 梁婷蓉
Owner: TIANJIN CHENGJIAN UNIV