
Neural network recognition model training method and device, server and storage medium

A technology for neural network recognition and model training

Active Publication Date: 2020-07-10
SHENZHEN UNIV

AI Technical Summary

Problems solved by technology

However, visual data is greatly disturbed by the environment, such as occlusion of the target and light intensity, and is not suitable for more private scenes, so many scholars also use radar sensors for detection.
[0003] However, there has been no major breakthrough in radar-based human action recognition. This is largely because there is currently no large-scale radar database for human action recognition on the Internet, so researchers must collect radar data samples themselves. Labeling this data requires the assistance of prior information, which means radar sensors cannot collect data unattended the way visual sensors can. The process is time-consuming and laborious, and it limits much radar-based research work.

Method used



Examples


Embodiment 1

[0032] Figure 1 is a flow chart of the neural network recognition model training method provided by Embodiment 1 of the present invention. This embodiment is applicable to training a neural network recognition model. The method specifically includes the following steps:

[0033] S110. Obtain human body skeleton data collected by the visual sensor for the human body to be detected;

[0034] In this embodiment, the visual sensor refers to an instrument that uses optical elements and imaging devices to obtain image information about the external environment; image resolution is usually used to describe a visual sensor's performance. The visual sensor of this embodiment, the Kinect V2, uses not only optical elements but also depth sensors, infrared emitters, etc. to obtain depth information. Exemplarily, the Kinect V2 sensor is a 3D somatosensory camera that introduces functions such as real-time motion capture, image recognition, microphone input, voice recognition, social interaction, and skeleton tracking.
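The skeleton data described above is typically a per-frame array of joint positions: Kinect V2 tracks 25 skeleton joints, each with (x, y, z) coordinates. As a minimal sketch of the acquisition step (the function name and the synthetic motion model are illustrative assumptions, not from the patent — real data would come from the Kinect SDK):

```python
import numpy as np

NUM_JOINTS = 25  # Kinect V2 tracks 25 skeleton joints

def acquire_skeleton_frames(num_frames, rng=None):
    """Stand-in for the visual-sensor capture step (S110).

    Returns an array of shape (num_frames, NUM_JOINTS, 3) holding
    (x, y, z) joint positions in metres. Here we synthesize a
    plausible slowly drifting pose instead of reading real frames.
    """
    rng = rng or np.random.default_rng(0)
    # A static pose roughly 3 m in front of the sensor...
    base = rng.uniform(-0.5, 0.5, size=(NUM_JOINTS, 3)) + np.array([0.0, 0.0, 3.0])
    # ...plus small cumulative per-frame motion.
    drift = rng.normal(scale=0.01, size=(num_frames, NUM_JOINTS, 3)).cumsum(axis=0)
    return base + drift

frames = acquire_skeleton_frames(100)
print(frames.shape)  # (100, 25, 3)
```

Downstream steps can then treat each frame as a 25×3 matrix of joint coordinates.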

Embodiment 2

[0045] Figure 2 is a flow chart of the neural network recognition model training method provided by Embodiment 2 of the present invention. This embodiment is further optimized on the basis of the above embodiments. The method specifically includes:

[0046] S210. Acquire human skeleton data collected by the visual sensor from the human body to be detected;

[0047] In this embodiment, the visual sensor refers to an instrument that uses optical elements and imaging devices to obtain image information about the external environment; image resolution is usually used to describe a visual sensor's performance. The visual sensor of this embodiment, the Kinect V2, uses not only optical elements but also depth sensors, infrared emitters, etc. to obtain depth information. It introduces functions such as real-time motion capture, image recognition, microphone input, voice recognition, social interaction, and skeleton tracking. The computer can use vision tec...
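The step of combining skeleton data with a "first model" to generate simulated radar data is not detailed in this excerpt. One common approach in the radar literature treats each joint as a point scatterer and derives its Doppler shift from the radial velocity toward the radar, f_d = 2·v_r / λ. The sketch below follows that approach; the radar carrier frequency, frame rate, and function names are my assumptions, not disclosed by the patent:

```python
import numpy as np

C = 3e8                    # speed of light, m/s
F_CARRIER = 77e9           # assumed mmWave radar carrier frequency
WAVELENGTH = C / F_CARRIER
FPS = 30.0                 # assumed skeleton capture frame rate

def joint_doppler(positions, radar_pos=np.zeros(3)):
    """positions: (frames, joints, 3) joint tracks in metres.

    Returns (frames - 1, joints) Doppler shifts in Hz, computed from
    the frame-to-frame change in each joint's range to the radar.
    (Sign conventions vary; here a receding joint gives a positive value.)
    """
    rel = positions - radar_pos           # vectors radar -> joint
    dist = np.linalg.norm(rel, axis=-1)   # (frames, joints) ranges
    v_r = np.diff(dist, axis=0) * FPS     # radial velocity, m/s
    return 2.0 * v_r / WAVELENGTH

# One joint moving away along z at 1 m/s: |f_d| = 2 / WAVELENGTH ≈ 513 Hz.
track = np.zeros((2, 1, 3))
track[0, 0] = [0.0, 0.0, 2.0]
track[1, 0] = [0.0, 0.0, 2.0 + 1.0 / FPS]
print(round(float(joint_doppler(track)[0, 0]), 1))  # 513.3
```

Aggregating such per-joint Doppler shifts over time yields a micro-Doppler signature that can stand in for measured radar data during training.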

Embodiment 3

[0071] Figure 4 shows a schematic structural diagram of the neural network recognition model training device 300 provided by the third embodiment of the present invention. This embodiment is applicable to training the neural network recognition model, and the specific structure is as follows:

[0072] A human skeleton data acquisition module 310, used to obtain the human skeleton data collected by the visual sensor for the human body to be detected;

[0073] A simulation data generating module 320, used to combine the human skeleton data with a first model to generate the simulated radar data of the human body to be detected;

[0074] A recognition model training module 330, configured to train a second model using the simulated radar data to obtain a neural network recognition model;

[0075] A measured data acquisition module 340, configured to acquire the measured radar data collected by the radar sensor for the human body to be detected;

[0076] The recognition mo...
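The modules above form a simulate → train → verify pipeline. A minimal runnable sketch, assuming toy stand-ins for the first and second models (all class and function names are hypothetical; the patent does not disclose the models' internals in this excerpt):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class RecognitionModelTrainer:
    """Hypothetical stand-in for training device 300."""
    simulate: Callable  # "first model": skeleton data -> simulated radar data (module 320)
    train: Callable     # fits the "second model" on simulated radar data (module 330)

    def run(self, skeleton_data, measured_radar_data):
        simulated = self.simulate(skeleton_data)   # module 320
        model = self.train(simulated)              # module 330
        # Verification: feed the measured radar data (module 340's
        # output) through the trained recognition model.
        predictions = [model(x) for x in measured_radar_data]
        return model, predictions

# Toy stand-ins so the pipeline runs end to end.
def simulate_radar(skeletons):
    # Pretend "first model": collapse each skeleton frame to one feature.
    return [sum(frame) / len(frame) for frame in skeletons]

def train_classifier(features):
    # Pretend "second model": a threshold at the mean training feature.
    threshold = sum(features) / len(features)
    return lambda x: int(x > threshold)

skeletons = [[0.1, 0.2, 0.3], [0.4, 0.5, 0.6], [0.7, 0.8, 0.9]]
measured = [0.15, 0.85]
trainer = RecognitionModelTrainer(simulate=simulate_radar, train=train_classifier)
model, preds = trainer.run(skeletons, measured)
print(preds)  # [0, 1]
```

The design point is separation of concerns: the simulation model that expands the radar database is swappable independently of the recognition model it trains.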



Abstract

The invention discloses a neural network recognition model training method and device, a server and a storage medium. The method comprises the steps of: obtaining human skeleton data collected by a visual sensor for a to-be-detected human body; combining the human skeleton data with a first model to generate simulated radar data of the to-be-detected human body; training a second model using the simulated radar data to obtain a neural network recognition model; acquiring actually measured radar data of a radar sensor for the to-be-detected human body; and inputting the actually measured radar data into the neural network recognition model for verification. According to this technical scheme, the effect of expanding the radar database is achieved.

Description

Technical field

[0001] Embodiments of the present invention relate to radar recognition technology, and in particular to a neural network recognition model training method, device, server and storage medium.

Background technique

[0002] Human action recognition has always been one of the research hotspots; in particular, vision-based human action recognition has matured in recent years with the rise of deep learning. Visual data is intuitive and easy to understand, and there are many ready-made public databases and many application scenarios. However, visual data is highly disturbed by the environment, such as occluded targets and light intensity, and is not suitable for more private scenes. Therefore, many scholars also use radar sensors for detection.

[0003] However, there has been no major breakthrough in radar-based human action recognition. This is largely due to the fact that there is no large-scale radar database for human action recogniti...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06N3/04; G06N3/08; G01S7/41
CPC: G06N3/08; G01S7/417; G01S7/415; G06V40/20; G06N3/045; Y02A90/10
Inventor: 阳召成, 刘海帆, 赖佳磊
Owner: SHENZHEN UNIV