
Multi-level semantic feature extraction method for behavior data of intelligent wearable equipment

A wearable-device and semantic-feature technology, applied in the field of mobile perception. It addresses the problems that existing approaches to understanding and analyzing user behavior data cannot meet current needs, that trained networks cannot interpret unlabeled behaviors, and that practicality is limited; the effects are reduced labor cost, high recognition accuracy, and improved precision.

Active Publication Date: 2017-04-26
WUXI TSINGHUA NAT LAB FOR INFORMATION SCI & TECH INTERNET OF THINGS TECH CENT

AI Technical Summary

Problems solved by technology

However, labeling behaviors requires substantial manpower and material resources, and the trained network cannot understand unlabeled behaviors, so this approach is difficult to apply in real-world scenarios.
[0004] Existing methods that provide high recognition accuracy largely require researchers to have prior knowledge of the usage scenarios or of the specific behavior types captured by smart wearable devices, which limits their practicality in real life.
In practice, the usage environments, devices, and behaviors of wearable-device users are highly varied, and manual labeling cannot enumerate all possible situations.
As wearable devices penetrate people's daily lives, the existing methods of understanding and analyzing user behavior data fall far short of meeting these needs.

Method used




Embodiment Construction

[0029] The present invention will be further described below in conjunction with the drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit it. In addition, it should be noted that, for convenience of description, only the parts related to the present invention are shown in the accompanying drawings, not the entire content. Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by those skilled in the technical field of the present invention. The terms used herein are for describing specific embodiments only, and are not intended to limit the present invention.

[0030] Please refer to figure 1, which is a flowchart of a multi-level semantic feature extraction method for smart wearable device behavior data provided by an embodiment of the present invention.

[00...



Abstract

The invention discloses a multi-level semantic feature extraction method for behavior data of intelligent wearable equipment. The method comprises the following steps: S101, constructing a single-level behavior space and finding a group of bases in that space; S102, constructing a multi-level behavior space and analyzing the behavior data at different granularities; S103, extracting the multi-level semantic features of the behavior data in the multi-level behavior space. The method does not need manually labeled data and is applicable to any behavior, greatly reducing manual cost. Meanwhile, the extracted semantic features can be used to analyze behaviors at different granularities, thereby guaranteeing high recognition precision. Compared with a conventional method based on a pre-defined semantic feature space, the method greatly improves accuracy; compared with a conventional method based on a supervised deep neural network, it can provide higher recognition precision.
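
The excerpt does not disclose how the group of bases is found or how the levels are constructed, so the following is only a minimal sketch of one plausible reading of steps S101-S103: learn a set of basis vectors for a single-level behavior space by unsupervised dictionary learning on fixed-length sensor windows, repeat at several window granularities to obtain a multi-level space, and concatenate the per-level codes as the multi-level semantic feature. Every function name, parameter value, and the use of scikit-learn's MiniBatchDictionaryLearning is an assumption made for illustration, not the patented method itself.

```python
# Illustrative sketch only -- the patent excerpt does not disclose the exact
# algorithm; dictionary learning stands in for "finding a group of bases in
# the behavior space" (S101), and multiple window lengths stand in for the
# multi-level behavior space (S102).
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning


def sliding_windows(signal, win_len, step):
    """Cut a 1-D sensor stream into overlapping fixed-length windows."""
    n = (len(signal) - win_len) // step + 1
    return np.stack([signal[i * step: i * step + win_len] for i in range(n)])


def fit_single_level_basis(windows, n_atoms=32, seed=0):
    """S101 (assumed): learn basis vectors (atoms) for one behavior space."""
    model = MiniBatchDictionaryLearning(
        n_components=n_atoms, transform_algorithm="lasso_lars", random_state=seed
    )
    model.fit(windows)
    return model


def multi_level_features(signal, win_lens=(32, 64, 128), step_ratio=0.5):
    """S102 + S103 (assumed): build behavior spaces at several granularities
    (window lengths) and concatenate the per-level codes as one feature."""
    per_level = []
    for win_len in win_lens:
        step = max(1, int(win_len * step_ratio))
        windows = sliding_windows(signal, win_len, step)
        model = fit_single_level_basis(windows)
        codes = model.transform(windows)        # projection onto the bases
        per_level.append(codes.mean(axis=0))    # pool codes over windows
    return np.concatenate(per_level)            # multi-level semantic feature


if __name__ == "__main__":
    # Toy accelerometer-like stream; real data would come from the wearable.
    rng = np.random.default_rng(0)
    stream = np.sin(np.linspace(0, 40 * np.pi, 4000)) + 0.1 * rng.standard_normal(4000)
    features = multi_level_features(stream)
    print(features.shape)  # one fixed-length descriptor spanning all granularities
```

Under these assumptions, no labels are used at any point, which matches the abstract's claim of avoiding manual annotation; the choice of window lengths is what determines the granularities at which behaviors are analyzed.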

Description

Technical field

[0001] The invention relates to the field of mobile perception, in particular to a multi-level semantic feature extraction method for behavior data of smart wearable devices.

Background technique

[0002] In recent years, due to the increasing maturity of communication, sensor, and embedded computing technologies, the market for smart wearable devices such as smart watches and bracelets has developed rapidly. Compared with smartphones, smart wearable devices are closer to users' daily lives and are more likely to be worn for long periods. Therefore, analyzing and understanding user behavior data through smart wearable devices will greatly promote the development of smart medical care, the smart economy, insurance, and other fields. However, existing applications remain at a relatively low level: most applications in the application market, such as WeChat Sports, Gudong Running, etc., can only record the user's steps ...

Claims


Application Information

IPC(8): G06N3/04, G06N3/08
CPC: G06N3/08, G06N3/045
Inventor: 刘慈航, 张兰, 刘宗前, 刘克彬, 李向阳, 刘云浩
Owner: WUXI TSINGHUA NAT LAB FOR INFORMATION SCI & TECH INTERNET OF THINGS TECH CENT