
Position embedding interpretation method and device, computer equipment and storage medium

A machine learning technique concerning position and category data. It addresses problems in related work such as ignoring the interpretability of position embeddings and of the model itself, which fails to meet the needs of business scenarios. Its effects include overcoming the sparseness of learned vectors and their inability to cover rich semantic information, speeding up model convergence, and reducing the learning scale.


AI Technical Summary

Problems solved by technology

At the same time, the interpretability of positional embeddings is also ignored.
[0005] In summary, the position embedding models in the related art ignore model interpretability, so that each dimension of the learned vector has no specific meaning and cannot meet the needs of business scenarios.

Method used



Examples


Embodiment 1

[0055] Figure 1 is a flow chart of a position embedding interpretation method provided by an embodiment of the present invention. The method targets the uninterpretable position representations in the field of trajectory representation and, through the model's representation and learning, gives each dimension of the position representation a clear, easy-to-understand semantic meaning. The method includes steps S10 to S40.

[0056] S10: Obtain an original data set and preprocess it, wherein the preprocessed data set consists of multiple pieces of location data and multiple pieces of location category data, and each piece of location data corresponds to one piece of location category data.

[0057] In one embodiment, the original data set is a user check-in data set, and preprocessing it includes: filtering out the check-in data of users with fewer than 20 check-ins in the original data, and simultaneously filtering out th...
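The user-filtering step described above can be sketched as follows. The 20-check-in threshold comes from the text; the data layout (a list of `(user_id, location_id, category)` records) is an assumption made for illustration only.

```python
from collections import Counter

def filter_sparse_users(checkins, min_checkins=20):
    """Keep only the check-ins of users with at least `min_checkins` records.

    `checkins` is assumed to be a list of (user_id, location_id, category)
    tuples; the threshold of 20 follows the embodiment's description.
    """
    counts = Counter(user for user, _, _ in checkins)
    return [rec for rec in checkins if counts[rec[0]] >= min_checkins]

# Toy example: user "u1" has only 2 check-ins and is filtered out.
data = [("u1", "loc_a", "food")] * 2 + [("u2", "loc_b", "park")] * 20
kept = filter_sparse_users(data)
```

The same pattern would apply to the symmetric filter on locations that the truncated sentence begins to describe.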

Embodiment 2

[0138] Figure 4 is a schematic structural diagram of a position embedding interpretation device provided by an embodiment of the present invention. The device implements the position embedding interpretation method of Embodiment 1 and includes a data acquisition module 410, a location embedding module 420, a category embedding module 430 and a semantic representation module 440.

[0139] The data acquisition module 410 is used to acquire an original data set and preprocess it, wherein the preprocessed data set consists of multiple pieces of location data and multiple pieces of location category data, and each piece of location data corresponds to one piece of location category data.

[0140] The location embedding module 420 is used to obtain multiple locations corresponding to the multiple pieces of location data, wherein each location corresponds to at least one piece of location data, and, according to the multiple location context sequences in the preprocesse...
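Skip-gram learning over location context sequences (as named in the abstract) is driven by (center, context) training pairs extracted with a sliding window. A minimal sketch of the pair extraction, assuming a window size that is not specified in this excerpt:

```python
def skipgram_pairs(sequence, window=2):
    """Extract (center, context) pairs from a location visit sequence,
    as used to train a Skip-gram model on location context sequences.
    The window size is an assumed hyperparameter."""
    pairs = []
    for i, center in enumerate(sequence):
        lo = max(0, i - window)
        hi = min(len(sequence), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, sequence[j]))
    return pairs

seq = ["home", "cafe", "office", "gym"]
pairs = skipgram_pairs(seq, window=1)
# With window=1, each location is paired only with its immediate neighbours.
```

In practice these pairs would feed a standard Word2vec-style training loop, with locations playing the role of words; the same extraction applies to the category sequences handled by module 430.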

Embodiment 3

[0169] Figure 5 is a schematic structural diagram of a computer device provided by an embodiment of the present invention. As shown in Figure 5, the device includes a processor 510 and a memory 520. There may be one or more processors 510; Figure 5 takes one processor 510 as an example.

[0170] The memory 520, as a computer-readable storage medium, can be used to store software programs, computer-executable programs and modules, such as the program instructions/modules corresponding to the position embedding interpretation method in the embodiment of the present invention. The processor 510 executes the software programs, instructions and modules stored in the memory 520 to implement the above position embedding interpretation method.

[0171] The memory 520 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system and an application program required by at least one function; the data storage ar...



Abstract

The invention discloses a position embedding interpretation method and device, computer equipment and a storage medium. The method comprises the following steps: acquiring an original data set, and preprocessing the original data set; acquiring a plurality of positions corresponding to the plurality of pieces of position data; according to a plurality of position context sequences in the preprocessed original data set, performing learning by using a Skip-gram model to obtain a plurality of position embedding vectors; obtaining a plurality of position categories corresponding to the plurality of pieces of position category data; performing learning by using a Skip-gram model to obtain a plurality of position category embedding vectors, and enabling the plurality of position embedding vectors and the plurality of position category embedding vectors to be located in an original vector space; and according to a predetermined rule, converting each position embedding vector from an original vector space to a semantic vector space to obtain a position semantic representation corresponding to each position embedding vector. According to the method, more semantic information can be learned, so that the vector of each dimension has interpretability.
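The final step converts each position embedding from the original vector space to a semantic vector space "according to a predetermined rule". That rule is not disclosed in this excerpt; one plausible instantiation, possible precisely because the abstract places position and category embeddings in the same original space, is to let the k-th dimension of the semantic vector be the cosine similarity to the k-th category embedding, so each dimension reads as an affinity to a named category. A sketch under that assumption:

```python
import math

def cosine(u, v):
    """Cosine similarity of two equal-length vectors."""
    num = sum(a * b for a, b in zip(u, v))
    den = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return num / den

def to_semantic_space(pos_vec, category_vecs):
    """Map a position embedding to a semantic vector whose k-th dimension
    is its cosine similarity to the k-th category embedding. This is one
    assumed reading of the patent's undisclosed 'predetermined rule'."""
    return [cosine(pos_vec, c) for c in category_vecs]

# Hypothetical 2-D embeddings: two categories and one position vector.
categories = {"food": [1.0, 0.0], "park": [0.0, 1.0]}
sem = to_semantic_space([0.9, 0.1], list(categories.values()))
# sem[0] (food affinity) dominates sem[1] (park affinity).
```

Whatever the actual rule, the interpretability claim rests on each output dimension being tied to a known category rather than to an anonymous latent axis.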

Description

technical field

[0001] The embodiments of the present invention relate to the technical field of machine learning, and in particular to a position embedding interpretation method, device, computer equipment and storage medium.

Background technique

[0002] The statements in this section merely provide background information related to the present invention and do not necessarily constitute prior art.

[0003] The learning of embedding models falls mainly into two categories. The first is inspired by the success of word embedding models: the word-vector (Word2vec) framework can be used to learn position embeddings (i.e., position embedding vectors) from check-in data. Check-in sequences can be modeled and the influence of linear context captured to learn location embeddings, which can then be used for personalized place recommendation. Beyond this sequential mode, follow-up work begins to consider dynamic user preferences and the time factor of learning position re...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F16/35, G06F40/216, G06F40/30, G06K9/62
CPC: G06F16/35, G06F40/216, G06F40/30, G06F18/22
Inventors: 丁冬睿, 陈勐, 张凯, 杨光远