Clothes classifying method based on convolutional neural network

A convolutional neural network technology and classification method, applied in the field of image information processing, which address the problems of unsatisfactory classification results and low classification accuracy in traditional approaches, and achieve convenient and accurate feature extraction.

Inactive Publication Date: 2015-12-02
NANJING UNIV OF INFORMATION SCI & TECH
6 Cites · 37 Cited by

AI Technical Summary

Problems solved by technology

Because of the limitations of artificially designed features, the effectiveness of traditional methods depends largely on whether the selected features are reasonable; this selection is largely blind and generally suffers from low classification accuracy.
Therefore, current clothing classification algorithms have two main limitations:
First, traditional features cannot achieve satisfactory classification results, especially for categories with similar attributes.
Second, there is currently no publicly available clothing database with which to objectively evaluate existing algorithms.

Method used




Embodiment Construction

[0025] Embodiments of the present invention are described in detail below, examples of which are shown in the drawings, wherein the same or similar reference numerals denote the same or similar elements or elements having the same or similar functions throughout. The embodiments described below by referring to the figures are exemplary only for explaining the present invention and should not be construed as limiting the present invention.

[0026] As shown in Figure 1, which is a schematic flow chart of the clothing classification method based on a convolutional neural network according to the present invention, the following embodiments are described in detail with reference to Figure 1.

[0027] Step 1. Obtain clothing images, establish training samples and test samples. According to the common clothes styles in the market, the clothing is divided into 16 categories, including 8 categories of men's clothing and 8 categories of women's clothing. The pictures corresponding to this attribu...
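A minimal sketch of how the Step 1 split into training and test samples across 16 categories (8 men's, 8 women's) might be organized. The category names, per-category image counts, and the 80/20 split ratio are illustrative assumptions, not taken from the patent:

```python
import random

# Hypothetical 16 categories: 8 men's plus 8 women's, per the patent's Step 1.
# The generic names below are placeholders, not the patent's actual categories.
CATEGORIES = [f"mens_{i}" for i in range(8)] + [f"womens_{i}" for i in range(8)]

def split_samples(images_per_category=100, train_ratio=0.8, seed=0):
    """Assign image paths to each category and split into train/test lists.

    Returns two lists of (path, label) pairs, split per category so every
    class appears in both sets (a stratified split).
    """
    rng = random.Random(seed)
    train, test = [], []
    for label, name in enumerate(CATEGORIES):
        ids = [f"{name}/{k:04d}.jpg" for k in range(images_per_category)]
        rng.shuffle(ids)                      # randomize before splitting
        cut = int(len(ids) * train_ratio)     # e.g. 80 train / 20 test per class
        train += [(p, label) for p in ids[:cut]]
        test += [(p, label) for p in ids[cut:]]
    return train, test

train, test = split_samples()
print(len(train), len(test))
```

Splitting within each category rather than over the pooled image list keeps the class balance identical in both sets, which matters when some categories have similar attributes.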



Abstract

The invention discloses a clothes classifying method based on a convolutional neural network. The method comprises the following steps of acquiring clothes image samples, and dividing the samples into training samples and testing samples; preprocessing the training samples and the testing samples; constructing a convolutional neural network model; performing training of two stages including a forward propagation stage and a backward propagation stage on the convolutional neural network model through preprocessed training samples, finishing the training when the error calculated during the backward propagation stage reaches a desired value, and acquiring a parameter of the convolutional neural network model; testing the preprocessed testing samples by using the trained convolutional neural network model and outputting final clothes classifying results. The convolutional neural network model can make clothes images directly serve as network inputs, extract image features in an implicit way and establish a global feature expression. Compared with a manually designed feature extraction way, the method is more convenient and accurate. The problem that a conventional algorithm leads to low clothes classifying accuracy is solved.
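The two-stage training loop in the abstract (a forward propagation stage, a backward propagation stage, and a stopping condition when the error reaches a desired value) can be sketched as follows. This is a heavily simplified illustration, not the patented architecture: a single fixed 3x3 averaging filter stands in for the convolutional layers, 2 toy classes replace the 16 clothing categories, and only the final softmax weights are trained.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d_valid(img, kernel):
    """Naive 'valid' 2-D convolution, used here as a fixed feature extractor."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(h - kh + 1):
        for j in range(w - kw + 1):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Toy data: class-0 "images" have negative mean, class-1 positive mean,
# so the two classes are easily separable by the pooled features.
X = [rng.normal(loc=(-1) ** (i + 1), size=(8, 8)) for i in range(20)]
y = [i % 2 for i in range(20)]

kernel = np.ones((3, 3)) / 9.0                     # fixed "conv" filter
feats = [conv2d_valid(x, kernel).mean(axis=1) for x in X]  # crude pooling -> 6-dim
W = np.zeros((2, 6))                               # trainable softmax weights

target_error, lr = 0.05, 0.5
loss = float("inf")
for epoch in range(2000):
    loss = 0.0
    for f, label in zip(feats, y):
        p = softmax(W @ f)                         # forward propagation stage
        loss += -np.log(p[label])                  # cross-entropy error
        grad = np.outer(p - np.eye(2)[label], f)   # backward propagation stage
        W -= lr * grad                             # gradient-descent update
    loss /= len(X)
    if loss < target_error:                        # stop when error hits target
        break

print(f"mean cross-entropy after training: {loss:.4f}")
```

The stopping rule mirrors the abstract's "finish the training when the error calculated during the backward propagation stage reaches a desired value"; in a full CNN the convolutional kernels would be updated by the same backward pass rather than held fixed.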

Description

technical field

[0001] The invention relates to a clothing classification method based on a convolutional neural network (Convolutional Neural Network, CNN), which belongs to the technical field of image information processing.

Background technique

[0002] At present, researchers have proposed many algorithms for automatic clothing classification. Pan et al. proposed using a BP neural network to identify knitted fabrics. Ben et al. proposed a recognition method for knitted fabrics based on text features and support vector machines. Liu et al. proposed pose-based estimation and used features such as color, SIFT, and HOG to classify clothes into 23 categories. Bourdev et al. developed a system to describe the appearance of people using 9 attributes, such as male, T-shirt, and long hair. In addition, the segmentation of clothes is also a research hotspot. proposed to use a constrained Delaunay triangulation (CDT) based foreground and background estimation, ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/66
CPC: G06V30/194
Inventors: Liu Qingshan (刘青山), Wang Feng (王枫), Li Zhi (厉智), Yang Jing (杨静), Deng Jiankang (邓健康)
Owner: NANJING UNIV OF INFORMATION SCI & TECH