
Smart device-based radar system for performing gesture recognition using spatio-temporal neural network

A radar-system and neural-network technology, applied to biological neural network models, neural architectures, neural learning methods, etc., achieving the effect of saving power and memory without significantly increasing size.

Pending Publication Date: 2022-04-15
GOOGLE LLC
Cites: 0 · Cited by: 0
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0005] The spatiotemporal neural network is implemented using a multi-stage machine learning architecture.

Method used



Examples

Experimental program
Comparison scheme
Effect test

Example 1

[0132] Example 1: A method performed by a radar system comprising:

[0133] transmitting a radar transmit signal using an antenna array of the radar system;

[0134] receiving, using the antenna array, a radar receive signal comprising a version of the radar transmit signal reflected by at least one user;

[0135] generating composite radar data based on the radar receive signal;

[0136] providing the composite radar data to a spatio-temporal neural network of the radar system, the spatio-temporal neural network comprising a multi-stage machine learning architecture; and

[0137] analyzing the composite radar data using the spatio-temporal neural network to identify a gesture performed by the at least one user.
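The claimed method above is a five-step pipeline: transmit, receive, form composite (complex-valued) radar data, and hand that data to a spatio-temporal network. A minimal sketch of that flow, using numpy and entirely hypothetical names and dimensions (the patent does not specify antenna counts, chirp parameters, or the network itself), might look like this:

```python
import numpy as np

rng = np.random.default_rng(2)

# Steps 1-2 (hypothetical stand-ins): the transmitted signal and its echoes
# yield raw I/Q samples per antenna element, per chirp.
num_antennas, num_chirps, samples_per_chirp = 3, 16, 64
iq_samples = rng.standard_normal((num_antennas, num_chirps, samples_per_chirp)) \
    + 1j * rng.standard_normal((num_antennas, num_chirps, samples_per_chirp))

# Step 3: generate composite (complex-valued) radar data, e.g. via a
# range FFT along each chirp's fast-time samples.
composite_radar_data = np.fft.fft(iq_samples, axis=-1)

# Steps 4-5: provide the composite data to a spatio-temporal neural
# network. A stub stands in for the multi-stage architecture here.
def spatio_temporal_network(data):
    # Placeholder only: a real multi-stage network would analyze the
    # spatial domain first, then the temporal domain.
    return "swipe" if np.abs(data).mean() > 0 else "none"

gesture = spatio_temporal_network(composite_radar_data)
```

Note that the composite data stays complex-valued end to end; per Example 2 below, both its magnitude and phase carry gesture information.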

Example 2

[0138] Example 2: The method of Example 1, wherein:

[0139] analyzing the composite radar data comprises analyzing both magnitude and phase information of the composite radar data using machine learning techniques to identify the gesture.
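Example 2 hinges on keeping both magnitude and phase of the complex data rather than discarding phase. A small numpy sketch of that decomposition (the array shape and feature stacking are illustrative assumptions, not from the patent):

```python
import numpy as np

# Hypothetical composite radar data: (frames, range_bins) of complex samples.
rng = np.random.default_rng(0)
composite_radar_data = rng.standard_normal((4, 8)) + 1j * rng.standard_normal((4, 8))

# Per Example 2: extract both magnitude and phase, not magnitude alone.
magnitude = np.abs(composite_radar_data)
phase = np.angle(composite_radar_data)

# Stack them into a real-valued tensor a learning model can consume.
features = np.stack([magnitude, phase], axis=-1)  # (frames, range_bins, 2)
```

Magnitude alone loses the sub-wavelength motion information encoded in phase; keeping both channels preserves the full complex signal, since `magnitude * exp(1j * phase)` reconstructs it exactly.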

Example 3

[0140] Example 3: The method of Example 1 or 2, wherein:

[0141] the multi-stage machine learning architecture comprises a spatial recurrent network and a temporal recurrent network; and

[0142] wherein analyzing the composite radar data comprises:

[0143] analyzing the composite radar data in a spatial domain using the spatial recurrent network to generate feature data associated with the gesture; and

[0144] analyzing the feature data in a temporal domain using the temporal recurrent network to identify the gesture.
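The two-stage split in Example 3 — a spatial recurrent network producing per-frame feature data, then a temporal recurrent network consuming those features across frames — can be sketched with a minimal vanilla RNN in numpy. All dimensions, weights, and the use of a plain tanh RNN are assumptions for illustration; the patent does not disclose the specific recurrent cell:

```python
import numpy as np

def rnn(seq, W_x, W_h, b):
    """Minimal vanilla RNN: folds a sequence into a final hidden state."""
    h = np.zeros(W_h.shape[0])
    for x in seq:
        h = np.tanh(W_x @ x + W_h @ h + b)
    return h

rng = np.random.default_rng(1)
frames, bins, feat, hidden = 5, 8, 2, 16

# Stage 1 weights: the spatial recurrent network runs across range bins
# within each frame, producing one feature vector per frame.
Wx_s = rng.standard_normal((hidden, feat))
Wh_s = rng.standard_normal((hidden, hidden)) * 0.1
b_s = np.zeros(hidden)

# Stage 2 weights: the temporal recurrent network runs across frames,
# summarizing the gesture's motion over time.
Wx_t = rng.standard_normal((hidden, hidden)) * 0.1
Wh_t = rng.standard_normal((hidden, hidden)) * 0.1
b_t = np.zeros(hidden)

radar_frames = rng.standard_normal((frames, bins, feat))

# Spatial domain first: one pass per frame over its range bins.
spatial_features = np.stack([rnn(f, Wx_s, Wh_s, b_s) for f in radar_frames])

# Then the temporal domain: one pass over the per-frame features.
gesture_embedding = rnn(spatial_features, Wx_t, Wh_t, b_t)
```

One plausible motivation for this factoring, consistent with the abstract's power and latency claims: the spatial stage compresses each frame to a small feature vector as it arrives, so the temporal stage only carries a compact state between frames rather than reprocessing the full radar tensor.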



Abstract

Techniques and apparatuses are described that implement a smart-device-based radar system capable of performing gesture recognition using a spatiotemporal neural network (222). The spatiotemporal neural network (222) employs machine learning to identify a gesture of the user based on composite radar data. The spatiotemporal neural network (222) is implemented using a multi-stage machine learning architecture, which enables the radar system (102) to save power and identify a user's gesture in real time (e.g., as the gesture is performed). The spatiotemporal neural network (222) is also adaptable and can be extended to identify multiple types of gestures, such as a swipe gesture and a stretch gesture, without significantly increasing size, computational requirements, or latency.

Description

Background

[0001] Radar is a useful device that can detect objects. Radar can provide improved performance relative to other types of sensors, such as cameras, in the presence of different environmental conditions, such as low lighting and fog, or with moving or overlapping objects. Radar can also detect objects through one or more obscuring objects, such as purses or pockets. While radar has many advantages, there are many challenges associated with integrating radar into electronic devices.

[0002] One challenge involves power constraints within small or mobile electronic devices. The operation of some radars significantly drains the battery of the electronic device and causes the user to frequently recharge it. Thus, the benefits of utilizing radar may not be realized in situations where effective operation of the radar is curtailed or disabled due to limitations in available power.

[0003] Another challenge concerns the constraints that sma...

Claims

the structure of the environmentally friendly knitted fabric provided by the present invention; figure 2 Flow chart of the yarn wrapping machine for environmentally friendly knitted fabrics and storage devices; image 3 Is the parameter map of the yarn covering machine
Login to View More

Application Information

Patent Type & Authority: Application (China)
IPC(8): G01S13/34; G01S13/58; G01S13/88; G01S7/35; G01S7/41; G01S7/539; G06F3/01; G06K9/00; G06K9/62; G06N3/04; G06N3/08
CPC: G06F3/017; G01S13/88; G06F3/014; G01S7/417; G01S7/539; G01S7/4802; G01S13/345; G01S13/584; G01S7/352; G01S7/356; G06N3/08; G06V40/28; G06V10/94; G06V10/82; G06N3/044; G06N3/045; G06F2218/08; G06F2218/12; G06F18/24133; G01S13/06; G06F3/011
Inventor: 米卡尔·马图兹克, 阿贝尔·塞勒什·门格斯图, 尼古拉斯·吉利恩, 阿比吉特·阿隆·沙阿
Owner GOOGLE LLC