
Human mobility pattern inference model based on variational trajectory context awareness, training method and inference method

A technology relating to mobility patterns and training methods, applied in computing models, character and pattern recognition, instruments, etc.

Active Publication Date: 2019-08-20
UNIV OF ELECTRONICS SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to address the Trajectory Context Learning (TCL) problem, which has not been formally defined in existing semantic trajectory mining, by proposing a method for inferring trajectory context along the semantic dimension of the trajectory and applying the proposed method to the problem of human mobility pattern prediction.



Examples


Embodiment

[0061] This embodiment provides a human mobility pattern inference model based on variational trajectory context awareness. As shown in Figure 1, the model comprises a data preprocessing module, a recurrent trajectory encoder, a variational trajectory encoder, a variational attention layer, and a decoder. The data preprocessing module obtains the embedding vector of each point of the current trajectory. The recurrent trajectory encoder encodes the input embedding vectors to obtain the semantic vector of the current trajectory. The variational trajectory encoder learns from the same embedding vectors to obtain a variational latent variable of the current trajectory that follows a Gaussian distribution. The variational attention layer uses the variational attention mechanism to obtain the attention vector of the current trajectory from its semantic vector, and then concatenates the attention vector with the variational latent variable to reconstruct the input of the decoder.
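A minimal sketch of how these five components could be wired together, assuming a PyTorch implementation; the layer sizes, the GRU choice for both recurrent networks, the dot-product attention scoring, and every identifier below are illustrative assumptions rather than the patent's reference design:

```python
# Hedged sketch of the embodiment's components: embedding (data preprocessing),
# recurrent trajectory encoder, variational trajectory encoder, variational
# attention layer, and decoder. All names and sizes are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class VariationalTrajectoryModel(nn.Module):
    def __init__(self, num_points, emb_dim=64, hid_dim=128, lat_dim=32):
        super().__init__()
        self.embed = nn.Embedding(num_points, emb_dim)              # data preprocessing: point -> embedding vector
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)   # recurrent trajectory encoder
        self.to_mu = nn.Linear(hid_dim, lat_dim)                    # variational trajectory encoder (Gaussian latent)
        self.to_logvar = nn.Linear(hid_dim, lat_dim)
        self.attn_mu = nn.Linear(hid_dim, hid_dim)                  # variational attention layer (Gaussian attention vector)
        self.attn_logvar = nn.Linear(hid_dim, hid_dim)
        self.decoder = nn.GRU(hid_dim + lat_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, num_points)                   # decoder output over trajectory points

    @staticmethod
    def reparameterize(mu, logvar):
        # Sample z ~ N(mu, sigma^2) with the reparameterization trick.
        return mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)

    def forward(self, traj, dec_len):
        emb = self.embed(traj)                                      # (B, T, emb_dim)
        states, h_last = self.encoder(emb)                          # per-step semantic vectors of the trajectory
        h_last = h_last.squeeze(0)                                  # (B, hid_dim)

        # Variational latent variable of the whole trajectory.
        z_mu, z_logvar = self.to_mu(h_last), self.to_logvar(h_last)
        z = self.reparameterize(z_mu, z_logvar)                     # (B, lat_dim)

        # Deterministic attention over encoder states, then a Gaussian
        # "variational attention" vector sampled around the context.
        scores = torch.bmm(states, h_last.unsqueeze(2))             # (B, T, 1)
        weights = F.softmax(scores, dim=1)
        context = (weights * states).sum(dim=1)                     # (B, hid_dim)
        a_mu, a_logvar = self.attn_mu(context), self.attn_logvar(context)
        attn = self.reparameterize(a_mu, a_logvar)

        # Concatenate attention vector and latent variable to rebuild the
        # decoder input, repeated at every decoding step.
        dec_in = torch.cat([attn, z], dim=-1).unsqueeze(1).repeat(1, dec_len, 1)
        dec_states, _ = self.decoder(dec_in)
        logits = self.out(dec_states)                               # (B, dec_len, num_points)
        return logits, (z_mu, z_logvar), (a_mu, a_logvar)

# Tiny smoke test: 4 random trajectories of 10 points over a 500-point vocabulary.
model = VariationalTrajectoryModel(num_points=500)
logits, z_params, a_params = model(torch.randint(0, 500, (4, 10)), dec_len=10)
print(logits.shape)  # torch.Size([4, 10, 500])
```

In training, such a model would typically pair a reconstruction/prediction loss on the decoder outputs with KL-divergence terms on the two Gaussian distributions, in line with the lower-bound optimization described in the Abstract.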



Abstract

The invention discloses a human mobility pattern inference model, a training method, and an inference method based on variational trajectory context awareness. The method comprises the following steps: first, a trajectory semantic vector and a variational latent variable are obtained through a recurrent trajectory encoder and a variational trajectory encoder, respectively; an attention vector of the trajectory is then obtained with a variational attention mechanism and concatenated with the variational latent variable to reconstruct the input of a decoder; finally, the previous trajectory is restored and the predicted trajectory is generated from the semantic vectors output by the decoder. Within this encoder-decoder framework, the invention solves the trajectory context learning problem and completes the two sub-tasks of human mobility pattern inference, namely trajectory recovery and trajectory prediction. The model can not only estimate the probability density and optimize the lower bound of the data likelihood, but also capture the sequential and temporal characteristics of human mobility, effectively solving the problem of trajectory inference from trajectory context awareness and improving the inference of human mobility patterns.
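The Abstract's claim of optimizing "the lower bound of the data likelihood" refers to an evidence lower bound (ELBO) of the kind used by variational encoder-decoders; the generic form below is a sketch for orientation, not the patent's own objective:

\log p_\theta(x) \;\ge\; \mathbb{E}_{q_\phi(z \mid x)}\!\left[\log p_\theta(x \mid z)\right] \;-\; D_{\mathrm{KL}}\!\left(q_\phi(z \mid x) \,\|\, p(z)\right)

where x is the observed trajectory, z is the Gaussian latent variable produced by the variational trajectory encoder, q_\phi is the approximate posterior it defines, and p(z) is a standard normal prior.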

Description

technical field

[0001] The invention belongs to the field of deep learning within machine learning and relates to a human mobility pattern inference technology based on variational trajectory context awareness. It mainly uses deep learning to analyze, at scale, Location-Based Social Networks (LBSN) built on geographic location information, deeply mining large-scale trajectory data and learning mobility semantics at the trajectory level to achieve end-to-end prediction, thereby improving the prediction of human mobility patterns.

Background technique

[0002] Over the past decade, interest in mining human mobility patterns from location-based social networks has grown rapidly. The availability of large amounts of LBSN data facilitates research on user behavior and mobility patterns, such as Point of Interest (POI) recommendation, travel planning, and various privacy protection issues.

[0003] Most of the existing research focuse...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N20/00; G06K9/62
CPC: G06N20/00; G06F18/214
Inventors: 钟婷 (Zhong Ting), 周帆 (Zhou Fan), 岳晓丽 (Yue Xiaoli)
Owner: UNIV OF ELECTRONICS SCI & TECH OF CHINA