
Gait recognition and emotion perception method and system based on intelligent acoustic equipment

An acoustic-device-based gait recognition technology, applied in the field of situation perception, which addresses problems such as leakage of personal privacy, interference with pedestrians' natural gait, and inconsistency with users' daily living conditions.

Pending Publication Date: 2020-09-11
SHENZHEN UNIV

AI Technical Summary

Problems solved by technology

First, gait authentication and emotion recognition systems based on camera images will, to some extent, capture the pedestrian's facial information or external information such as clothing, which leads to personal privacy leakage, while user privacy protection in the home scenario has attracted increasing attention in recent years.
Second, gait authentication and emotion recognition systems based on wearable devices require the user to wear or carry a data collection device, which inconveniences the user. In particular, systems based on VICON equipment require sensors to be installed at key acquisition nodes on the user's body, which interferes with the pedestrian's gait and is inconsistent with the user's daily living conditions.




Embodiment Construction

[0044] Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that the relative arrangements of components and steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless specifically stated otherwise.

[0045] The following description of at least one exemplary embodiment is merely illustrative and is in no way intended to limit the invention, its application, or uses.

[0046] Techniques, methods and devices known to those of ordinary skill in the relevant art may not be discussed in detail, but where appropriate, such techniques, methods and devices should be considered part of the description.

[0047] In all examples shown and discussed herein, any specific values should be construed as exemplary only, and not as limitations. Therefore, other instances of the exemplary embodiments may have different values.


Abstract

The invention provides a gait recognition and emotion perception method and system based on intelligent acoustic equipment. The method comprises the following steps: using a loudspeaker as a wave source to send signals, and using a microphone to collect the signal reflected by a target pedestrian to obtain audio data; processing the audio data to obtain a corresponding energy map, and segmenting out the signal containing a gait event; for the signal containing the gait event, independently extracting macroscopic gait features, microscopic gait features, and the embedding representation features of various neural networks; and taking the fusion of these feature vectors as input and using a trained classifier to obtain the emotion classification result of the target pedestrian. The method can be applied in a smart home environment and can perform emotion recognition without invading individual privacy and without requiring the user to carry additional equipment.
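The abstract outlines a concrete pipeline: emit a signal from the loudspeaker, record the reflection from the walking pedestrian with the microphone, convert the recording into an energy map, segment out gait events, extract and fuse gait features, and feed the fused vector to a trained classifier. The sketch below is a minimal, hedged illustration of that flow in Python; the sample rate, STFT settings, energy threshold, toy feature definitions, and the SVM classifier are assumptions chosen for demonstration, not values or models specified by the patent (which fuses macroscopic features, microscopic features, and neural-network embeddings).

```python
# Minimal sketch of the pipeline described in the abstract. All parameters
# below (sample rate, STFT sizes, threshold, feature choices) are illustrative
# assumptions, not values taken from the patent.
import numpy as np
from scipy import signal


def energy_spectrogram(audio: np.ndarray, fs: int = 48_000) -> np.ndarray:
    """Energy map of the microphone recording (squared STFT magnitude)."""
    _, _, stft = signal.stft(audio, fs=fs, nperseg=1024, noverlap=768)
    return np.abs(stft) ** 2


def segment_gait_events(energy: np.ndarray, threshold: float = 1e-6) -> list:
    """Mark runs of time frames whose total energy exceeds a threshold."""
    active = energy.sum(axis=0) > threshold
    events, start = [], None
    for i, on in enumerate(active):
        if on and start is None:
            start = i
        elif not on and start is not None:
            events.append(slice(start, i))
            start = None
    if start is not None:
        events.append(slice(start, len(active)))
    return events


def extract_features(energy: np.ndarray, event: slice) -> np.ndarray:
    """Toy stand-ins for macro/micro features: duration, mean and peak energy,
    plus a coarse spectral profile. A real system would also append the
    embedding representations learned by neural networks."""
    seg = energy[:, event]
    macro = np.array([seg.shape[1], seg.mean(), seg.max()])
    micro = seg.mean(axis=1)[::64]            # down-sampled spectral profile
    return np.concatenate([macro, micro])     # fused feature vector


# Usage sketch on a synthetic recording (stand-in for real microphone data).
fs = 48_000
recording = np.random.default_rng(0).normal(scale=1e-3, size=fs * 5)
energy = energy_spectrogram(recording, fs)
events = segment_gait_events(energy)
X = [extract_features(energy, ev) for ev in events]
# With labelled training data (X_train, y_train) the final step would be e.g.:
#   from sklearn.svm import SVC
#   emotions = SVC().fit(X_train, y_train).predict(np.stack(X))
```

Any classifier could stand in for the SVM here; the abstract only requires that a trained classifier take the fused feature vectors as input.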

Description

Technical Field
[0001] The present invention relates to the technical field of situation perception, and more specifically, to a method and system for gait recognition and emotion perception based on intelligent acoustic equipment.
Background Art
[0002] With the popularity of IoT smart devices, natural human-computer interaction in smart home scenarios is becoming more and more important. With the development of smart homes, affective computing for the Internet of Things has realistic application scenarios; for example, pedestrians' emotions can be studied through their gait in order to achieve better human-computer interaction.
[0003] In the prior art, gait recognition and emotion perception are usually performed using walking data collected by cameras or by wearable devices.
[0004] For example, a Kinect camera is used to collect gait data of human joint nodes, and the gait data with emotion labels are analyzed and classified to distinguish...


Application Information

IPC (IPC-8): A61B5/16, A61B5/11, A61B5/00
CPC: A61B5/112, A61B5/1126, A61B5/165, A61B5/4803, A61B5/7203, A61B5/7225, A61B5/7235, A61B5/7267
Inventors: 邹永攀, 洪史聪, 伍楷舜, 刘金源, 潘子健
Owner: SHENZHEN UNIV