
Garment classification and collocation recommending method and garment classification and collocation recommending system based on deep convolution neural network

A deep convolutional neural network technology applied to clothing classification and collocation recommendation, achieving the effects of enriched collocation choices, high classification accuracy, and strengthened training

Status: Inactive · Publication date: 2017-03-15
TSINGHUA UNIV
Cites: 4 · Cited by: 71

AI Technical Summary

Problems solved by technology

At present, no existing technical solution achieves the above purpose, that is, providing consumers with garment-collocation recommendations.

Method used




Embodiment Construction

[0048] Embodiments of the present invention are described in detail below; examples are shown in the drawings, where the same or similar reference numerals designate elements that are the same or similar, or that have the same or similar functions, throughout. The embodiments described below with reference to the figures are exemplary; they are intended only to explain the present invention and should not be construed as limiting it.

[0049] In describing the present invention, it should be understood that the terms "center", "longitudinal", "transverse", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner" and "outer" indicate orientations or positional relationships based on those shown in the drawings, and are used only for convenience and simplicity of description, rather than indicating or implying that the device or element refe...



Abstract

The invention proposes a garment classification and collocation recommendation method and system based on a deep convolutional neural network. The method comprises the following steps: improving the original GoogleNet convolutional neural network by adding batch normalization, improved inception structures, and a redundant classifier, and extracting features from garment images to obtain their classification results; augmenting the collocation-library training set in multiple ways, including distorting and flipping the garment images and transforming their color space, and training the improved GoogleNet classification model; and searching a suit library for similar items and their collocations by generating identity information for each garment image, comparing the identity information to find similar images, and recommending garment collocations according to the gender, style, and function information of the garment images. Corresponding garment collocation advice can be given to consumers from an input garment image. The method and the system have the advantages of high speed and high precision.
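The augmentation step in the abstract (distorting, flipping, and color-space transformation of garment images) can be sketched as follows. The function name, parameter ranges, and use of NumPy are illustrative assumptions, not details taken from the patent.

```python
import numpy as np

def augment(image, rng):
    """Sketch of the three augmentations the abstract names:
    horizontal flip, a small geometric distortion, and a
    color-space (per-channel) scaling. `image` is an H x W x 3
    float array with values in [0, 1]."""
    out = image.copy()
    if rng.random() < 0.5:              # random horizontal flip
        out = out[:, ::-1, :]
    shift = int(rng.integers(-2, 3))    # crude distortion: small horizontal shift
    out = np.roll(out, shift, axis=1)
    scale = rng.uniform(0.8, 1.2, 3)    # per-channel color scaling
    out = np.clip(out * scale, 0.0, 1.0)
    return out

rng = np.random.default_rng(0)
garment = rng.random((8, 8, 3))         # stand-in for a garment image
augmented = augment(garment, rng)
print(augmented.shape)                  # (8, 8, 3)
```

In practice, several such random transformations would be applied per image to enlarge the collocation-library training set before training the classification model.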

Description

technical field

[0001] The invention relates to the technical field of image classification, and in particular to a clothing classification and matching recommendation method and system based on a deep convolutional neural network.

Background technique

[0002] Image recognition is a computer-vision technology that acquires, analyzes, processes, and recognizes images to automatically obtain the main features of the object of interest in a target image. It plays a very important role in research fields such as object localization, object classification, and gesture recognition. In industry, it also shows excellent performance and extremely high commercial value in applications such as face recognition, product recommendation, target tracking, and motion detection. In artificial-intelligence fields such as unmanned driving and fully automatic robots, it likewise has good prospects for development. In the processin...
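The recommendation step described in the abstract generates identity information (a feature vector) for each garment image and compares these vectors to find similar items in the library. A minimal sketch of that comparison using cosine similarity is shown below; the item names, vectors, and function names are hypothetical, and the patent does not specify the distance measure used.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def find_similar(query_vec, library, top_k=2):
    """Return the top_k item ids in `library` (a dict mapping
    item id -> identity vector) most similar to `query_vec`."""
    scored = sorted(library.items(),
                    key=lambda kv: cosine_similarity(query_vec, kv[1]),
                    reverse=True)
    return [item_id for item_id, _ in scored[:top_k]]

# Hypothetical identity vectors for three library garments.
library = {
    "shirt_01": np.array([1.0, 0.0, 0.2]),
    "skirt_07": np.array([0.1, 1.0, 0.0]),
    "coat_03":  np.array([0.9, 0.1, 0.3]),
}
query = np.array([1.0, 0.05, 0.25])
print(find_similar(query, library))  # ['shirt_01', 'coat_03']
```

Once similar items are found, their stored collocations would be filtered by the gender, style, and function attributes mentioned in the abstract before being recommended.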

Claims


Application Information

Patent Timeline
IPC(8): G06Q30/06; G06K9/62; G06F17/30
CPC: G06F16/58; G06Q30/0631; G06F18/24
Inventors: 黄双喜 (Huang Shuangxi), 杨天祺 (Yang Tianqi)
Owner: TSINGHUA UNIV