Neural network-based multi-modal complementary clothing collocation method, system and medium

A neural-network and multi-modal technology, applied to biological neural network models, data-processing applications, sales/lease transactions, etc., which addresses problems such as the inability to comprehensively model the compatibility between items

Active Publication Date: 2020-05-12
SHANDONG UNIV

AI Technical Summary

Problems solved by technology

These methods usually consider only the visual information of items and therefore cannot comprehensively model the compatibility between items.
In addition, the clothing-matching process still suffers from data sparsity.




Detailed Description of the Embodiments

[0065] It should be pointed out that the following detailed description is exemplary and is intended to provide further explanation of the present application. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.

[0066] It should be noted that the terminology used here is only for describing specific implementations and is not intended to limit the exemplary implementations according to the present application. As used herein, unless the context clearly dictates otherwise, the singular forms are intended to include the plural forms; it should also be understood that the terms "comprising" and/or "including", when used in this specification, indicate the presence of the stated features, steps, operations, devices, components and/or combinations thereof.

[0067] The method mainly includes the following contents:

[0068] Mining multi-modal information of commodities (i.e., v...
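The visible text truncates here. As a hedged illustration of the feature-mining step named above, the sketch below extracts visual features with a pretrained torchvision CNN and text features with TF-IDF over the item descriptions. The specific extractors, dimensions, and example descriptions are assumptions for illustration, not choices taken from the patent.

```python
# Hypothetical feature-extraction sketch; the patent text does not specify
# the extractors, so a pretrained ResNet-18 and TF-IDF are assumed here.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image
from sklearn.feature_extraction.text import TfidfVectorizer

# Visual features: pooled activations of a pretrained CNN, classifier removed.
cnn = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
cnn.fc = torch.nn.Identity()        # expose the 512-d pooled features
cnn.eval()

preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def visual_features(path: str) -> torch.Tensor:
    """Return a 512-d visual feature vector for one clothing image."""
    img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        return cnn(img).squeeze(0)

# Text features: TF-IDF vectors over the items' text descriptions.
descriptions = ["slim-fit blue denim jeans", "white cotton crew-neck t-shirt"]
vectorizer = TfidfVectorizer(max_features=512)
text_features = vectorizer.fit_transform(descriptions).toarray()
```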



Abstract

The invention discloses a multi-modal complementary clothes matching method, system and medium based on a neural network. The method comprises: a step of obtaining visual features from pictures of clothes and text features from text descriptions of the clothes; a step of learning a compatibility space over the visual and text features of different clothes by using an autoencoder, obtaining latent representations of the visual features and of the text features; a step of establishing a relationship model between the reconstruction vectors and the input features; a step of establishing a clothes compatibility model; a step of constructing a compatibility preference model by using the Bayesian personalized ranking (BPR) algorithm based on the clothes compatibility model; a step of establishing a consistency model between the visual-feature latent representation and the text-feature latent representation; a step of establishing a multi-modal latent-feature consistency model of the clothes; a step of constructing a multi-modal complementary clothes matching model based on a deep neural network; a step of training the constructed multi-modal complementary clothes matching model; and a step of performing clothes matching recommendation by using the trained model.
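To make this pipeline concrete, below is a minimal, hypothetical PyTorch sketch of the training objective, assuming the per-item visual and text features are already extracted. The layer shapes, sigmoid activations, the inner-product compatibility score, and the loss weights alpha and beta are illustrative assumptions, not the patent's specified architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ModalityAutoencoder(nn.Module):
    """Encodes one modality (visual or text) into the shared compatibility
    space and reconstructs the input, so the reconstruction vector can be
    tied back to the input feature."""
    def __init__(self, in_dim: int, latent_dim: int):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(in_dim, latent_dim), nn.Sigmoid())
        self.dec = nn.Sequential(nn.Linear(latent_dim, in_dim), nn.Sigmoid())

    def forward(self, x):
        z = self.enc(x)              # latent representation
        return z, self.dec(z)        # (latent, reconstruction)

class CompatibilityModel(nn.Module):
    """Scores a (top, bottom) pair from visual and text latents."""
    def __init__(self, vis_dim=512, txt_dim=512, latent_dim=128):
        super().__init__()
        self.vis_ae = ModalityAutoencoder(vis_dim, latent_dim)
        self.txt_ae = ModalityAutoencoder(txt_dim, latent_dim)

    def encode_item(self, vis, txt):
        zv, vis_rec = self.vis_ae(vis)
        zt, txt_rec = self.txt_ae(txt)
        return zv, zt, vis_rec, txt_rec

    def pair_score(self, top, bottom):
        zv_t, zt_t, _, _ = self.encode_item(*top)
        zv_b, zt_b, _, _ = self.encode_item(*bottom)
        # Compatibility: inner products in the shared latent space,
        # summed over the two modalities.
        return (zv_t * zv_b).sum(-1) + (zt_t * zt_b).sum(-1)

def training_loss(model, top, pos_bottom, neg_bottom, alpha=0.1, beta=0.1):
    """BPR preference loss plus reconstruction and cross-modal consistency."""
    s_pos = model.pair_score(top, pos_bottom)
    s_neg = model.pair_score(top, neg_bottom)
    bpr = -F.logsigmoid(s_pos - s_neg).mean()   # Bayesian personalized ranking

    recon = consist = 0.0
    for vis, txt in (top, pos_bottom, neg_bottom):
        zv, zt, vis_rec, txt_rec = model.encode_item(vis, txt)
        recon = recon + F.mse_loss(vis_rec, vis) + F.mse_loss(txt_rec, txt)
        consist = consist + F.mse_loss(zv, zt)  # align visual and text latents
    return bpr + alpha * recon + beta * consist

# Usage sketch with random stand-in features (batch of 8 items per role):
model = CompatibilityModel()
top = (torch.rand(8, 512), torch.rand(8, 512))
pos = (torch.rand(8, 512), torch.rand(8, 512))
neg = (torch.rand(8, 512), torch.rand(8, 512))
training_loss(model, top, pos, neg).backward()
```

The BPR term encodes the preference model (a compatible bottom should score higher than a randomly sampled incompatible one), while the reconstruction and consistency terms correspond to the relationship and multi-modal consistency models named in the abstract.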

Description

Technical Field

[0001] The invention relates to a neural network-based multi-modal complementary clothing collocation method, system and medium.

Background

[0002] Nowadays, beyond the physiological need for clothing, more and more people are beginning to pay attention to and pursue fashion, elegance, and decency. However, not everyone has great taste in matching clothes, and when faced with a large number of clothing products, many people find matching clothing very difficult and tedious. Therefore, an effective clothing matching scheme is developed to help people find a coordinated and stylish combination for a given garment. Current clothing collocation technology mainly includes collaborative-filtering-based methods and content-based methods. The former makes recommendations based on the historical behaviors of users with similar tastes and preferences, such as users' purchase behaviors and users' text descriptions of commoditie...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06Q30/06; G06N3/02
CPC: G06N3/02; G06Q30/0631; G06Q30/0643
Inventor: 刘金环, 宋雪萌, 马军, 甘甜, 聂礼强
Owner: SHANDONG UNIV