
Fashion compatibility analysis method and system based on deep multi-modal feature fusion

A feature-fusion technology and analysis method in the field of computer vision and image applications. It addresses the scarcity of research on multi-modal information fusion for fashion items, offers broad applicability and practical value, improves matching accuracy, and produces reasonable outfit matches.

Status: Pending
Publication Date: 2022-07-15
GUANGXI UNIVERSITY OF FINANCE AND ECONOMICS

AI Technical Summary

Problems solved by technology

In existing research on fashion compatibility, the fusion of multi-modal information about fashion items has received little attention.



Examples


Embodiment 1

[0046] In order to make the above objects, features and advantages of the present application more clearly understood, the present application will be described in further detail below with reference to the accompanying drawings and specific embodiments.

[0047] As shown in Figure 1, a fashion compatibility analysis method based on deep multi-modal feature fusion comprises the following steps:

[0048] Sample feature extraction network: a ResNet-18-based visual feature extraction network extracts visual features, and a one-hot-encoding-based text feature extraction network processes the text data;
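
The excerpt names the two extraction branches but not their exact configurations, so the following is a minimal PyTorch sketch: a ResNet-18 backbone with its classifier head removed for visual features, and a multi-hot (bag-of-words one-hot) encoder for text. Class names, dimensions, and the `vocab_size` parameter are illustrative assumptions, not taken from the patent.

```python
# Hedged sketch of the sample feature extraction step; names are illustrative.
import torch
import torch.nn as nn
from torchvision import models

class VisualFeatureExtractor(nn.Module):
    """ResNet-18 backbone with the classification head removed."""
    def __init__(self):
        super().__init__()
        backbone = models.resnet18(weights=None)  # load pretrained weights in practice
        self.features = nn.Sequential(*list(backbone.children())[:-1])  # drop final fc

    def forward(self, images):            # images: (B, 3, 224, 224)
        x = self.features(images)         # (B, 512, 1, 1) after global average pooling
        return x.flatten(1)               # (B, 512) visual feature vectors

def one_hot_text_features(token_ids, vocab_size):
    """Bag-of-words multi-hot encoding of item descriptions.

    token_ids: (B, L) int64 tensor of word indices; returns (B, vocab_size).
    """
    out = torch.zeros(token_ids.size(0), vocab_size)
    return out.scatter_(1, token_ids, 1.0)
```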

[0049] After the features are extracted, they are fused: an attention-based visual-and-text feature fusion network fuses the extracted visual and text features, and an attention-based visual self-attention network strengthens the feature expression of the visual modality;
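
The patent excerpt does not detail the fusion or self-attention architectures, so here is a hedged sketch of one plausible reading: a learned softmax weighting over projected visual and text vectors for the fusion step, and multi-head self-attention over visual tokens for the enhancement step. All layer sizes and names (`AttentionFusion`, `VisualSelfAttention`) are assumptions.

```python
# Hedged sketch of attention-based fusion and visual self-attention.
import torch
import torch.nn as nn

class AttentionFusion(nn.Module):
    """Weight projected visual and text features with learned attention,
    then sum them into a single fused representation."""
    def __init__(self, vis_dim=512, txt_dim=1000, dim=256):
        super().__init__()
        self.vis_proj = nn.Linear(vis_dim, dim)
        self.txt_proj = nn.Linear(txt_dim, dim)
        self.score = nn.Linear(dim, 1)

    def forward(self, vis, txt):                              # (B, vis_dim), (B, txt_dim)
        modal = torch.stack(
            [self.vis_proj(vis), self.txt_proj(txt)], dim=1)  # (B, 2, dim)
        weights = torch.softmax(self.score(modal), dim=1)     # (B, 2, 1) attention
        return (weights * modal).sum(dim=1)                   # (B, dim) fused feature

class VisualSelfAttention(nn.Module):
    """Self-attention over visual tokens to strengthen the visual modality."""
    def __init__(self, dim=512, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, vis_tokens):                        # (B, N, dim) region features
        enhanced, _ = self.attn(vis_tokens, vis_tokens, vis_tokens)
        return enhanced.mean(dim=1)                       # (B, dim) pooled feature
```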

...

Embodiment 2

[0075] As shown in Figure 2, fashion compatibility analysis of clothing images to be tested is widely used in the fields of computer vision and graphics. Multi-modal information expresses the characteristics of the same item from different description angles; different angles contain different fashion item information, and fusing that information ensures the integrity of the item features. For clothing items of different categories, the method extracts features through the sample feature extraction network (a ResNet-18-based visual feature extraction network and a one-hot-encoding-based text feature extraction network), fuses them with the attention-based visual-and-text feature fusion network, strengthens the feature expression of the visual modality of fashion items, and uses the fusion-feature compatibility calculation network to shorten the distance between positive pairs of fusion features in the multi-modal vector space and expand the distance between negative pairs. The present invention can reasonably match single products and improve the accuracy of the matching results.
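
The positive/negative pair objective described here behaves like a standard metric-learning loss. The excerpt does not give the actual formula, so as an illustrative stand-in, a triplet margin loss that shortens positive-pair distances and expands negative-pair distances could look like this:

```python
# Assumed triplet-style stand-in for the compatibility calculation objective.
import torch.nn.functional as F

def compatibility_loss(anchor, positive, negative, margin=0.2):
    """anchor/positive/negative: (B, dim) fused features; positive pairs are
    compatible outfits, negative pairs incompatible ones."""
    pos_dist = F.pairwise_distance(anchor, positive)  # pull compatible pairs together
    neg_dist = F.pairwise_distance(anchor, negative)  # push incompatible pairs apart
    return F.relu(pos_dist - neg_dist + margin).mean()
```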

Embodiment 3

[0103] In order to make the above objects, features and advantages of the present application more obvious and easy to understand, the present application is described in further detail below with reference to the accompanying drawings and specific embodiments.

[0104] As shown in Figure 3, a fashion compatibility analysis system based on deep multi-modal feature fusion includes: an acquisition module, a sample feature extraction module, a modal feature expression module, a multi-layer mapping feature module, and a multi-modal feature fusion compatibility analysis module;

[0105] The acquisition module is used to collect a sample set of data to be tested;

[0106] The sample feature extraction module is used to train the sample feature extraction network on the sample set and to obtain the sample features of the data to be tested;

[0107] The modal feature expression module is used to perform feature ...
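
Since the excerpt truncates before the remaining module descriptions, the following is only a hypothetical sketch of how the five listed modules could be composed into one pipeline; every class and method name here is invented for illustration.

```python
# Hypothetical wiring of the five modules listed in Embodiment 3.
class FashionCompatibilitySystem:
    def __init__(self, extractor, self_attn, fusion, mapper, scorer):
        self.extractor = extractor   # sample feature extraction module
        self.self_attn = self_attn   # modal feature expression: visual enhancement
        self.fusion = fusion         # modal feature expression: visual-text fusion
        self.mapper = mapper         # multi-layer mapping feature module
        self.scorer = scorer         # compatibility analysis module

    def analyze(self, images, texts):
        vis, txt = self.extractor(images, texts)        # per-item modal features
        fused = self.fusion(self.self_attn(vis), txt)   # fused multi-modal features
        embedded = self.mapper(fused)                   # multi-modal vector space
        return self.scorer(embedded)                    # compatibility score
```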



Abstract

The invention discloses a fashion compatibility analysis method and system based on deep multi-modal feature fusion. The method comprises the steps of: extracting sample features with a sample feature extraction network, in which a ResNet-18-based visual feature extraction network extracts visual features and a one-hot-coding-based text feature extraction network processes the text data; fusing the extracted features, in which an attention-based visual-and-text feature fusion network fuses the extracted visual and text features and an attention-based visual self-attention network enhances the feature expression of the visual modality; mapping the fusion features into a multi-modal vector space with a feature representation network based on multi-layer mapping; and finally, using a fusion-feature compatibility calculation network to shorten the distance between positive pairs of fusion features in the multi-modal vector space and expand the distance between negative pairs. With the method and system, fashion single items can be reasonably matched, and the accuracy of the matching results is improved.
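
The abstract's "feature representation network based on multi-layer mapping" is not specified further in this excerpt; a plausible minimal reading is a small multi-layer perceptron that projects fused features into the shared multi-modal vector space. The depth, widths, and the final normalization below are assumptions.

```python
# Assumed multi-layer mapping network: an MLP into the multi-modal vector space.
import torch.nn as nn
import torch.nn.functional as F

class MultiLayerMapping(nn.Module):
    def __init__(self, in_dim=256, hidden=128, out_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, fused):            # (B, in_dim) fused features
        x = self.net(fused)
        return F.normalize(x, dim=1)     # unit-norm embedding for distance comparison
```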

Description

Technical Field

[0001] The present application belongs to the field of computer vision and image applications, and in particular relates to a fashion compatibility analysis method and system based on deep multi-modal feature fusion.

Background

[0002] A well-matched outfit usually relies on the complementarity between fashion items, so it is very meaningful to study an automatic clothes-matching algorithm. Among existing research on fashion compatibility, there are few studies on the fusion of multi-modal information about fashion items. Multi-modal information expresses the characteristics of the same item from different description angles; different angles contain different information, and fusing that information ensures the integrity of the item's characteristics. The feature expression of a single product is the premise and key to accurately establishing the relationship model between single products, so the multi-moda...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06V10/44, G06V10/82, G06N3/04, G06N3/08
CPC: G06N3/08, G06N3/045
Inventors: Li Yun (李云), Wang Xuejun (王学军), Jing Peiguang (井佩光)
Owner: GUANGXI UNIVERSITY OF FINANCE AND ECONOMICS