Garment image retrieval method fusing color feature and residual network depth feature

A technology based on color features and deep residual networks, applied to still-image data retrieval, digital data information retrieval, still-image data clustering/classification, etc. It addresses problems such as loss of spatial structure and increased computation time and difficulty, yielding clearly better style and color similarity in results and improved retrieval efficiency.

Pending Publication Date: 2020-02-21
WUHAN TEXTILE UNIV
View PDF | 0 Cites | 11 Cited by

AI Technical Summary

Problems solved by technology

[0007] (1) The existing technology extracts features only from the fully connected layer and therefore cannot maintain spatial structure; such features mainly represent global information, so the local feature information of the clothing picture is lost.
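The spatial-structure problem can be illustrated with a minimal NumPy sketch (shapes and names are hypothetical, not the patent's implementation): a convolutional feature map of shape (C, H, W) keeps each activation tied to a grid position, while flattening it for a fully connected layer discards that layout.

```python
import numpy as np

# Hypothetical conv feature map for one image: C channels over an H x W grid.
# Spatial positions are preserved, so local garment details (collar, sleeve)
# still map to distinct (h, w) locations.
C, H, W = 4, 7, 7
conv_map = np.random.rand(C, H, W)

# A fully connected layer consumes a flattened vector, discarding which
# activation came from which spatial position -- only global information remains.
fc_vector = conv_map.reshape(-1)

print(conv_map.shape)   # (4, 7, 7) -- spatial structure kept
print(fc_vector.shape)  # (196,)    -- spatial structure lost
```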



Examples


Example Embodiment

[0064] Example 1

[0065] 1. Multi-feature fusion clothing image retrieval framework based on deep network

[0066] Multi-feature-fusion clothing image retrieval based on a deep network comprises two processes: feature extraction and similarity measurement, as shown in Figure 2. In feature extraction, the images in the dataset are first input into the pre-trained network model, the deep features output by the network layers are extracted, and an aggregation method fuses them with other feature information to form the global feature representation of each image, which is stored in the feature database. In similarity measurement, the clothing picture to be retrieved is input into the same neural network as the dataset images, and the same aggregation method yields the global feature vector of the query picture; the distances between the query feature vector and the vectors in the feature library are then computed and sorted by similarity to obtain the retrieval result.
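The extract-fuse-store-query pipeline above can be sketched in NumPy. This is an illustrative aggregation by L2-normalized concatenation with Euclidean ranking; the feature dimensions, the fusion rule, and all names are assumptions, not the patent's exact method.

```python
import numpy as np

def l2norm(v):
    """L2-normalize so deep and color features contribute on a comparable scale."""
    n = np.linalg.norm(v)
    return v / n if n > 0 else v

def global_feature(deep_feat, color_hist):
    """Fuse a deep feature vector with a color histogram into one global
    descriptor (illustrative concatenation; the patent's fusion may differ)."""
    return np.concatenate([l2norm(deep_feat), l2norm(color_hist)])

# Hypothetical feature database: 5 gallery images, 128-d deep + 64-d color each.
rng = np.random.default_rng(0)
db = np.stack([global_feature(rng.random(128), rng.random(64)) for _ in range(5)])

# The query goes through the same extraction and aggregation, then is ranked
# by distance to every database vector (smaller distance = more similar).
query = global_feature(rng.random(128), rng.random(64))
dists = np.linalg.norm(db - query, axis=1)
ranking = np.argsort(dists)  # gallery indices, most similar first
print(ranking)
```

Concatenating after per-part normalization keeps one feature type from dominating the distance simply because it has larger raw magnitudes.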

Example Embodiment

[0088] Example 2

[0089] 1. Data and parameter preparation

[0090] In order to verify the effect of the method proposed by the present invention, the Category and Attribute Prediction Benchmark is selected as the dataset for this experiment. The dataset contains more than 200,000 clothing pictures in 50 categories. From this subset, 60,000 images are extracted as the training set, 20,000 as the test set, and 20,000 as the validation set, covering 30 categories. The experiment is implemented in Python.
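A reproducible split with the sizes stated above can be drawn with a seeded permutation; the total count, seed, and index layout here are illustrative, not taken from the patent.

```python
import numpy as np

# Illustrative split for 60k train / 20k test / 20k validation images,
# as described for the Category and Attribute Prediction Benchmark subset.
rng = np.random.default_rng(42)
n_images = 100_000
idx = rng.permutation(n_images)

train_idx = idx[:60_000]
test_idx  = idx[60_000:80_000]
val_idx   = idx[80_000:]

print(len(train_idx), len(test_idx), len(val_idx))  # 60000 20000 20000
```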



Abstract

The invention belongs to the technical field of image retrieval, and discloses a garment image retrieval method fusing color features and residual network depth features, which comprises the following steps: inputting a training data set into a ResNet50-based network model; fusing the deep features and the color feature information to serve as the global feature representation of the image; clustering the vectors in the feature library by using a K-Means algorithm; inputting a to-be-retrieved garment picture into the same neural network as the data set, and obtaining the global feature vector of the to-be-queried garment picture; and sequentially calculating the distances between the cluster-center vectors and the vector of the to-be-retrieved picture, and performing similarity measurement through comparison of the distances to obtain a retrieval result. Experimental results show that the method can combine various kinds of feature information of the picture, the retrieval efficiency is high, and the time overhead is small. The extracted deep features have certain effectiveness and hierarchy. The method has high robustness and practicability and is superior to other mainstream retrieval methods.
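The cluster-then-compare step of the abstract can be sketched as a two-stage search: K-Means partitions the feature library offline, and a query is first matched against the cluster centers, then searched only within the nearest cluster. The minimal K-Means below and all dimensions are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Minimal K-Means by alternating assignment and mean update (illustrative)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(np.linalg.norm(X[:, None] - centers[None], axis=2), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

# Hypothetical feature library of global descriptors (200 images, 32-d each).
rng = np.random.default_rng(1)
library = rng.random((200, 32))
centers, labels = kmeans(library, k=4)

# Query: compare against the k cluster centers first, then search only inside
# the nearest cluster -- pruning most of the library cuts retrieval time.
query = rng.random(32)
nearest = np.argmin(np.linalg.norm(centers - query, axis=1))
candidates = np.where(labels == nearest)[0]
best = candidates[np.argmin(np.linalg.norm(library[candidates] - query, axis=1))]
print(best in candidates)
```

Distance computations drop from one per library image to one per cluster center plus one per member of a single cluster, which is where the claimed reduction in time overhead comes from.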

Description

technical field

[0001] The invention belongs to the technical field of image retrieval, and in particular relates to a clothing image retrieval method that combines color features and residual network depth features.

background technique

[0002] Currently, the closest prior art:

[0003] With the rapid development of the e-commerce industry, the clothing industry, as an important part of it, generates an ever-increasing amount of data. To deal with massive clothing image data, a new online clothing search mode has come into use by users, "searching for images with images", whose core is image retrieval technology. As the core of applications such as intelligent clothing recommendation and clothing search, clothing image retrieval has broad market application prospects. Clothing reflects the trends and tastes of contemporary people, and a great deal of semantic and detail information is contained in it. The color matching and style of clothing are its important semantic information, and th...


Application Information

IPC(8): G06F16/55; G06F16/583
CPC: G06F16/55; G06F16/5838; Y02P90/30
Inventor: 何儒汉, 侯媛媛, 刘军平, 彭涛, 陈常念, 胡欣荣
Owner WUHAN TEXTILE UNIV