Radar and multi-network fusion based pedestrian pose recognition method and system

A multi-network fusion recognition technology applied to pedestrian pose recognition methods and systems. It addresses the problems of low recognition accuracy, degraded recognition performance, and outright failure to recognize, while achieving reduced signal sampling rate requirements, easy implementation, and high-accuracy recognition.

Active Publication Date: 2018-11-30
武汉雷博合创电子科技有限公司

AI Technical Summary

Problems solved by technology

[0002] In the prior art, pedestrian posture recognition mostly uses an optical camera to acquire images and then performs image recognition on the acquired images. Because of interference from environmental factors such as lighting conditions, weather, and smoke, and especially at night, the recognition performance is greatly degraded: accuracy is low, targets may be entirely unrecognizable, and stability is very poor.
Therefore, this method can no longer meet demanding requirements such as round-the-clock operation.




Embodiment Construction

[0074] The principles and features of the present invention are described below in conjunction with the accompanying drawings; the examples given are only used to explain the present invention and are not intended to limit its scope.

[0075] As shown in Figure 1, a pedestrian posture recognition method based on radar and multi-network fusion includes the following steps:

[0076] Step 1: Preprocessing the echo signal of the radar signal to obtain the output signal;
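The patent does not specify the radar waveform or the preprocessing details. As a hedged illustration only, assuming an FMCW radar, Step 1 would typically mean windowing each chirp's beat signal and taking a fast-time FFT to form range profiles:

```python
import numpy as np

def preprocess(beat: np.ndarray) -> np.ndarray:
    """Illustrative Step 1 (assumed FMCW waveform, not stated in the patent).

    beat: (pulses x fast-time samples) real beat signal.
    Returns (pulses x range bins) complex range profiles.
    """
    win = np.hanning(beat.shape[1])          # taper to reduce range sidelobes
    return np.fft.rfft(beat * win, axis=1)   # fast-time FFT -> range bins

# Toy input: 4 chirps of 64 fast-time samples each.
rng = np.random.default_rng(1)
beat = rng.standard_normal((4, 64))
out = preprocess(beat)
print(out.shape)  # → (4, 33)
```

The real-input FFT of 64 samples yields 33 range bins; any window and FFT length here are illustrative choices, not taken from the patent.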

[0077] Step 2: suppressing stationary targets in the output signal;
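The patent does not say how stationary targets are suppressed. A common choice, shown here purely as an assumed sketch, is a two-pulse MTI canceller: a stationary target returns the same value every pulse, so subtracting consecutive pulses cancels it while Doppler-modulated (moving) returns survive.

```python
import numpy as np

def suppress_stationary(echo: np.ndarray) -> np.ndarray:
    """Two-pulse MTI canceller (assumed method, not from the patent).

    echo: (pulses x range bins), rows are slow time.
    Returns the pulse-to-pulse difference.
    """
    return echo[1:, :] - echo[:-1, :]

# Toy demo: bin 0 holds a stationary return, bin 1 a moving (oscillating) one.
pulses = np.arange(8)
echo = np.zeros((8, 2))
echo[:, 0] = 5.0                      # stationary: constant over slow time
echo[:, 1] = np.sin(0.5 * pulses)    # moving: Doppler-modulated

filtered = suppress_stationary(echo)
print(np.allclose(filtered[:, 0], 0.0))  # → True (stationary bin cancelled)
```

Higher-order cancellers or high-pass slow-time filtering would serve the same purpose; the two-pulse form is just the simplest instance.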

[0078] Step 3: Searching for the distance unit of the pedestrian in the output signal after suppression processing;
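The search criterion for Step 3 is not given in the patent. One plausible sketch, assumed here for illustration: after stationary-target suppression, the pedestrian's micro-motion leaves the strongest residual energy, so the distance unit can be picked as the range bin with maximum slow-time energy.

```python
import numpy as np

def find_pedestrian_bin(filtered: np.ndarray) -> int:
    """Pick the range bin with maximum residual energy (assumed criterion)."""
    energy = np.sum(np.abs(filtered) ** 2, axis=0)  # energy per range bin
    return int(np.argmax(energy))

# Toy demo: weak noise everywhere, a simulated pedestrian return at bin 9.
rng = np.random.default_rng(0)
filtered = 0.01 * rng.standard_normal((64, 16))
filtered[:, 9] += np.sin(0.8 * np.arange(64))  # micro-motion residual

print(find_pedestrian_bin(filtered))  # → 9
```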

[0079] Step 4: Perform time-frequency analysis on the echo signal corresponding to the distance unit where the pedestrian is located to obtain a time-frequency diagram of the echo signal;
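Time-frequency analysis of a pedestrian echo is commonly done with a short-time Fourier transform (STFT), which renders the micro-Doppler signature as a spectrogram. The window length, hop, and test signal below are illustrative assumptions; the patent does not specify them.

```python
import numpy as np

def stft_map(signal: np.ndarray, win_len: int = 32, hop: int = 8) -> np.ndarray:
    """Magnitude spectrogram via a sliding Hann window (illustrative STFT).

    Returns an array of shape (frequency bins x time frames).
    """
    win = np.hanning(win_len)
    frames = [signal[i:i + win_len] * win
              for i in range(0, len(signal) - win_len + 1, hop)]
    return np.abs(np.fft.rfft(np.asarray(frames), axis=1)).T

# Toy micro-Doppler-like signal: a sinusoidally frequency-modulated tone.
n = np.arange(256)
sig = np.cos(2 * np.pi * (0.1 * n + 5 * np.sin(2 * np.pi * n / 256)))
tf = stft_map(sig)
print(tf.shape)  # → (17, 29)
```

The resulting map is what Step 5 feeds to the convolutional neural networks, typically after normalization and resizing to the networks' input shape.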

[0080] Step 5: Using a plurality of convolutional neural networks to identify the time-frequency diagram and obtain a recognition result from each convolutional neural network.
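Step 5 and the fusion described in the Abstract can be sketched as follows. The fusion rule is not specified in the patent; averaging the per-network class-probability vectors and taking the argmax is one standard choice, and the pose classes and probability values below are illustrative, not taken from the patent.

```python
import numpy as np

def fuse_predictions(prob_list):
    """Average the class-probability vectors of several CNNs (assumed rule)."""
    fused = np.mean(np.asarray(prob_list), axis=0)
    return fused, int(np.argmax(fused))

poses = ["walking", "running", "standing"]  # illustrative class names
cnn_outputs = [
    np.array([0.6, 0.3, 0.1]),   # network 1's softmax output
    np.array([0.5, 0.4, 0.1]),   # network 2's softmax output
    np.array([0.7, 0.2, 0.1]),   # network 3's softmax output
]
fused, idx = fuse_predictions(cnn_outputs)
print(poses[idx])  # → walking
```

Weighted averaging or majority voting over the networks' hard decisions would be equally valid fusion rules under the same multi-network scheme.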



Abstract

The invention relates to a radar and multi-network fusion based pedestrian pose recognition method and system. The method comprises: pre-processing an echo signal of a radar signal to obtain an output signal; suppressing a stationary target in the output signal; searching for the distance unit containing the pedestrian in the suppressed output signal; performing time-frequency analysis on the echo signal corresponding to that distance unit to obtain an echo signal time-frequency map; using a plurality of convolutional neural networks to recognize the echo signal time-frequency map and obtaining a recognition result from each convolutional neural network; and fusing the recognition results of all the convolutional neural networks to obtain the fused pose recognition result. The method provides a pedestrian pose recognition result in real time by analyzing the radar echo, ensures high pose recognition accuracy through multi-neural-network fusion, is free from interference from factors such as illumination conditions, weather, and smoke, and is able to work around the clock in all weather.

Description

technical field

[0001] The invention relates to the technical field of radar signal processing and image recognition, and in particular to a pedestrian gesture recognition method and system based on radar and multi-network fusion.

Background technique

[0002] In the prior art, pedestrian posture recognition mostly uses an optical camera to acquire images and then performs image recognition on the acquired images. Because of interference from environmental factors such as lighting conditions, weather, and smoke, and especially at night, the recognition performance is greatly degraded: accuracy is low, targets may be entirely unrecognizable, and stability is very poor. Therefore, this method can no longer meet demanding requirements such as round-the-clock operation.

Contents of the invention

[0003] The technical problem to be solved by the present invention is to provide a pedestrian posture recognition method and system based on radar and multi...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/62; G06N3/04
CPC: G06V40/103; G06N3/045; G06F2218/12; G06F18/253
Inventor 张道明高元正龙希
Owner 武汉雷博合创电子科技有限公司