
Pedestrian motion mode recognition method based on deep hybrid model

A motion-mode and hybrid-model technology, applied to character and pattern recognition, biological neural network models, instruments, etc., which achieves the effects of reducing the amount of calculation, improving recognition accuracy, and providing high reliability.

Status: Inactive
Publication Date: 2020-01-10
UNIV OF ELECTRONICS SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

[0005] Aiming at the above-mentioned deficiencies in the prior art, the pedestrian motion pattern recognition method based on a deep hybrid model provided by the present invention overcomes the shortcomings of traditional manual extraction of pedestrian motion features, fully exploits the rich feature information contained in the different sensor signals, and reduces the loss of original feature information.



Examples


Embodiment

[0058] As shown in Figure 1, the present invention provides a pedestrian motion pattern recognition method based on a deep hybrid model. As shown in Figure 2, the network in the present invention is a hybrid model composed of a convolutional neural network (CNN) combined with XGBoost. The present invention uses the CNN as a trainable feature extractor that can automatically obtain features from the input, and uses the learning model XGBoost as the top-level recognizer of the network to generate the results, which effectively guarantees the high reliability of feature extraction and classification. Based on the acceleration sensor, gyroscope and magnetometer built into a smartphone, the present invention adopts the deep learning framework of the convolutional neural network (CNN) to mine rich, high-quality features from the input information, and then uses Principal Component Analysis (PCA) to reduce the dimensionality of the extracted features...
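
For illustration, a minimal sketch of the CNN, PCA and XGBoost hybrid described in this embodiment is given below. It assumes 128-sample windows of 9 sensor channels (three axes each of accelerometer, gyroscope and magnetometer) and 7 motion classes; the layer sizes, the number of PCA components and the XGBoost settings are illustrative assumptions, not values taken from the patent.

```python
# Sketch only: CNN feature extractor -> PCA dimension reduction -> XGBoost
# recognizer, under the assumptions stated above.
import tensorflow as tf
from sklearn.decomposition import PCA
from xgboost import XGBClassifier

def build_cnn(window=128, channels=9, n_classes=7):
    """1-D CNN trained end to end; its penultimate layer is later reused
    as the automatic feature extractor."""
    inp = tf.keras.Input(shape=(window, channels))
    x = tf.keras.layers.Conv1D(64, 5, activation="relu")(inp)
    x = tf.keras.layers.MaxPooling1D(2)(x)
    x = tf.keras.layers.Conv1D(128, 5, activation="relu")(x)
    x = tf.keras.layers.GlobalAveragePooling1D()(x)
    feats = tf.keras.layers.Dense(256, activation="relu", name="features")(x)
    out = tf.keras.layers.Dense(n_classes, activation="softmax")(feats)
    return tf.keras.Model(inp, out)

def train_hybrid(x_train, y_train):
    """x_train: (N, 128, 9) windows; y_train: (N,) integer labels in 0..6."""
    cnn = build_cnn()
    cnn.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
    cnn.fit(x_train, y_train, epochs=10, batch_size=64, verbose=0)

    # Reuse the trained network up to the "features" layer as the extractor.
    extractor = tf.keras.Model(cnn.input, cnn.get_layer("features").output)
    deep_feats = extractor.predict(x_train, verbose=0)

    # PCA reduces the high-dimensional CNN features to cut computation.
    pca = PCA(n_components=32).fit(deep_feats)

    # XGBoost acts as the top-level recognizer on the reduced features.
    clf = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1)
    clf.fit(pca.transform(deep_feats), y_train)
    return extractor, pca, clf
```

At inference time a new window would be passed through the extractor, projected with the fitted PCA, and classified by the XGBoost recognizer.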



Abstract

The invention provides a pedestrian motion mode recognition method based on a deep hybrid model. The method comprises: acquiring data for four mobile-phone placement positions and seven daily pedestrian motion modes using the acceleration sensor, gyroscope and magnetometer built into a smartphone; automatically extracting features with a convolutional neural network (CNN); reducing the dimensionality of the extracted features by principal component analysis; and inputting the processed result into the XGBoost learning model of the hybrid network for recognition. In the method, the CNN serves as a trainable feature extractor that automatically obtains features from the input, PCA reduces the dimensionality of the high-dimensional feature data to cut the amount of calculation, and XGBoost serves as the recognizer at the top of the network to output the result, which effectively guarantees high reliability of feature extraction and classification.
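
As a hedged illustration of the data acquisition described in the abstract, the sketch below segments synchronized accelerometer, gyroscope and magnetometer streams into fixed-length windows with one motion-mode label per window. The window length, stride and 9-channel layout are assumptions rather than parameters stated in the patent.

```python
# Sketch only: turn raw 9-channel sensor streams into labelled windows
# suitable as CNN input, under the assumptions stated above.
import numpy as np

def make_windows(samples: np.ndarray, labels: np.ndarray,
                 window: int = 128, stride: int = 64):
    """samples: (T, 9) array, three axes each of acc, gyro, mag.
    labels: (T,) integer motion-mode label per sample (0..6).
    Returns windows of shape (N, window, 9) and one label per window."""
    xs, ys = [], []
    for start in range(0, len(samples) - window + 1, stride):
        seg = samples[start:start + window]
        seg_labels = labels[start:start + window]
        # Keep only windows whose samples all share one motion mode.
        if np.all(seg_labels == seg_labels[0]):
            xs.append(seg)
            ys.append(int(seg_labels[0]))
    return np.stack(xs), np.array(ys)
```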

Description

Technical field

[0001] The invention belongs to the technical field of inertial-navigation motion pattern recognition, and in particular relates to a pedestrian motion pattern recognition method based on a deep hybrid model.

Background technique

[0002] In recent years, with the continuous development and maturation of MEMS manufacturing technology, low-cost, small-size, high-sensitivity sensors have been built into a large number of electronic devices such as smartphones and personal laptops, which makes acquiring and analyzing data increasingly convenient and flexible. At the same time, owing to the popularity of mobile phones, people have placed higher demands on their intelligence. Among these demands, pedestrian motion pattern recognition based on the built-in sensors of mobile phones has attracted widespread attention in the fields of indoor positioning, health monitoring and smart cities. Therefore, how to realize high-precision pedestrian motion...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K 9/62; G06N 3/04
CPC: G06N 3/045; G06F 18/2135
Inventors: 肖卓凌, 朱然, 宋儒君
Owner: UNIV OF ELECTRONICS SCI & TECH OF CHINA