Human behavior identification method based on mobile equipment

A technology concerning mobile devices and recognition methods, applied in the fields of character and pattern recognition, instruments, computer parts, etc.; it addresses problems such as the poor accuracy and poor generality of existing human behavior recognition methods.

Active Publication Date: 2016-06-15
ZHEJIANG UNIV

AI Technical Summary

Problems solved by technology

[0007] In order to overcome the shortcomings of poor versatility and poor accuracy in existing human behavior recognition methods, the present invention provides a mobile device-based human behavior recognition method with good versatility and high accuracy.



Detailed description of the embodiments

[0044] The present invention will be further described below in conjunction with the accompanying drawings.

[0045] Referring to Figure 1 to Figure 3, a mobile device-based human behavior recognition method comprises the following steps:

[0046] Step (1): train a device-position classification model c_p and, for each device position, a position-specific behavior classification model ca_i, ca_i ∈ C, where C is the set of behavior classification models, C = {ca_1, ca_2, ..., ca_I}; each behavior classification model ca_i corresponds one-to-one to a device position p_i, p_i ∈ P, where P is the set of predefined device positions, P = {p_1, p_2, ..., p_I}, and I is the number of predefined device-position categories;
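The excerpt does not specify which classifier family is used for the device-position model c_p or the behavior models ca_i, so the following Python sketch is illustrative only: it assumes feature vectors have already been extracted, and it uses scikit-learn SVMs purely as placeholder classifiers.

import numpy as np
from sklearn.svm import SVC

def train_models(X, y_position, y_behavior, positions):
    # X: (n_samples, n_features) feature vectors extracted from sensor windows
    # y_position: device-position label p_i for each sample
    # y_behavior: behavior label for each sample
    # positions: the predefined set P = {p_1, ..., p_I}
    X = np.asarray(X)
    y_position = np.asarray(y_position)
    y_behavior = np.asarray(y_behavior)

    # Device-position classification model c_p, trained on all samples.
    c_p = SVC(kernel="rbf").fit(X, y_position)

    # One behavior classification model ca_i per device position p_i,
    # trained only on the samples recorded at that position.
    C = {}
    for p in positions:
        mask = (y_position == p)
        C[p] = SVC(kernel="rbf").fit(X[mask], y_behavior[mask])
    return c_p, C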

[0047] Step (2): collect raw sensor data in real time through the built-in sensors of the mobile device;
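Sensor access itself is platform-specific (for example Android's SensorManager), so the sketch below only shows the buffering pattern; read_accelerometer and read_gyroscope are hypothetical callbacks standing in for the real device APIs, and the 50 Hz sampling rate is an assumption not stated in the excerpt.

import time
from collections import deque

SAMPLE_RATE_HZ = 50                          # assumed rate, not given in the source
buffer = deque(maxlen=SAMPLE_RATE_HZ * 10)   # keep the most recent 10 seconds of samples

def collect(read_accelerometer, read_gyroscope):
    period = 1.0 / SAMPLE_RATE_HZ
    while True:
        ax, ay, az = read_accelerometer()    # hypothetical sensor callbacks
        gx, gy, gz = read_gyroscope()
        buffer.append((time.time(), ax, ay, az, gx, gy, gz))
        time.sleep(period)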

[0048] Step (3): perform data preprocessing on the data collected in real time by the built-in sensors of the mobile device, to obtain the data s...
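The exact correction, filter, and segmentation parameters are not given in this excerpt, so the sketch below assumes a simple bias correction, a 10 Hz Butterworth low-pass filter, a derived acceleration-magnitude signal, and 2-second windows with 50% overlap, matching the kinds of preprocessing operations the abstract lists (correction, filtering, data calculation and generation, data segmentation).

import numpy as np
from scipy.signal import butter, filtfilt

def preprocess(samples, fs=50, window_s=2.0, overlap=0.5):
    data = np.asarray(samples)[:, 1:4]            # accelerometer columns ax, ay, az
    data = data - data.mean(axis=0)               # correction: remove constant bias
    b, a = butter(4, 10 / (fs / 2), btype="low")  # filtering: 4th-order 10 Hz low-pass
    data = filtfilt(b, a, data, axis=0)
    mag = np.linalg.norm(data, axis=1)            # data calculation: magnitude signal
    signal = np.column_stack([data, mag])
    win = int(window_s * fs)                      # data segmentation: sliding windows
    step = int(win * (1 - overlap))
    return [signal[i:i + win] for i in range(0, len(signal) - win + 1, step)]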


Abstract

The invention provides a human behavior identification method based on mobile equipment. The method comprises the steps of: acquiring data in real time from the various sensors built into the mobile equipment; performing a series of data preprocessing operations on the sensor data, such as correction, filtering, data calculation and generation, and data segmentation; extracting features from the preprocessed data and inputting the resulting feature vectors into an equipment-position classification model to obtain the equipment position class; and selecting the corresponding behavior classification model according to the obtained equipment position class, then inputting the same feature vectors into that behavior classification model to obtain the final behavior recognition result. The human behavior identification method based on mobile equipment provided by the invention has good universality and high accuracy.
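As a minimal end-to-end illustration of the two-stage pipeline the abstract describes (position classification first, then position-specific behavior classification), the Python sketch below reuses c_p and C from the training sketch above; extract_features is a hypothetical helper using plain time-domain statistics, since the actual feature set is not listed in this excerpt.

import numpy as np

def extract_features(window):
    # Placeholder time-domain statistics computed per channel of the window.
    return np.concatenate([window.mean(axis=0), window.std(axis=0),
                           window.min(axis=0), window.max(axis=0)])

def recognize(window, c_p, C):
    f = extract_features(window).reshape(1, -1)
    position = c_p.predict(f)[0]          # equipment position class
    behavior = C[position].predict(f)[0]  # behavior model selected by position
    return position, behavior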

Description

Technical field

[0001] The invention relates to the technical field of behavior recognition, in particular to a mobile device-based human behavior recognition method.

Background technique

[0002] Human behavior recognition is a technology that judges the state of human behavior by acquiring and analyzing data related to human behavior. By learning the basic behavioral activities of the human body, this technology can provide human-body-related information for research and applications in many fields, such as sports tracking, health monitoring, fall detection, elderly monitoring, patient recovery training, complex behavior recognition, auxiliary industrial manufacturing, human-computer interaction, augmented reality, indoor positioning and navigation, personal characteristic identification, and urban computing, so it has important application value and research significance.

[0003] Traditional human behavior recognition technology is mainl...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06K9/62
CPC: G06V40/23, G06F2218/00, G06F18/2411, G06F18/2451, G06F18/24323
Inventor: 潘赟, 茹晨光, 朱永光, 朱怀宇
Owner: ZHEJIANG UNIV