
Commodity classification method based on multi-modal deep neural network model

A technology involving deep neural networks and classification methods, applied in the field of commodity classification based on multi-modal deep neural network models. It addresses the difficulty data fusion methods have in directly combining data from different modalities and the inability of decision fusion methods to learn cross-modal correlations, and it achieves a small number of parameters, fast training speed, and a simple network structure.

Pending Publication Date: 2021-01-15
HOHAI UNIV

AI Technical Summary

Problems solved by technology

This method addresses two problems: data fusion methods can find it difficult to directly combine data from different modalities in a specific classification task, and decision fusion methods cannot learn the correlations between different modalities well.
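The distinction between the two fusion strategies can be illustrated with a minimal numpy sketch (the random feature vectors, dimensions, and class count below are hypothetical stand-ins, not values from the patent). In decision fusion, each modality gets its own classifier and only the final probability vectors are combined, so no learnable weight ever sees both modalities at once; in feature-level fusion, the modality features are concatenated first and a single joint classifier can learn across them.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-modality feature vectors (e.g. from a text and an image model).
text_feat = rng.normal(size=128)
image_feat = rng.normal(size=256)

num_classes = 10

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Decision fusion: a separate classifier per modality; only the output
# probabilities are combined (here by simple averaging), so no weight
# spans both modalities.
W_text = rng.normal(size=(num_classes, 128))
W_image = rng.normal(size=(num_classes, 256))
p_decision = 0.5 * softmax(W_text @ text_feat) + 0.5 * softmax(W_image @ image_feat)

# Feature-level fusion: concatenate the features on the feature
# dimension, then apply one joint classifier whose weights span both.
fused = np.concatenate([text_feat, image_feat])   # shape (384,)
W_joint = rng.normal(size=(num_classes, 384))
p_feature = softmax(W_joint @ fused)

print(p_decision.shape, p_feature.shape)          # both (10,)
```

Because `W_joint` multiplies the concatenated vector, its gradient during training would depend on both modalities jointly, which is what lets feature-level fusion capture cross-modal correlations that decision fusion cannot.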




Embodiment Construction

[0035] Embodiments of the invention are described in detail below, with examples illustrated in the accompanying drawings. The embodiments described below with reference to the figures are exemplary; they are intended only to explain the present invention and should not be construed as limiting it.

[0036] Most of the deep neural network models currently proposed for commodity classification are based on unimodal data only. In practice, product information takes many forms, including product text, product images, and videos, and models based on unimodal data do not make full use of this information. To address this, the present invention proposes a commodity classification method based on a multimodal deep neural network model.

[0037] As shown in Figure 1, the commodity classification method based on a multimodal deep neural network model of the present invention comprises the following steps:

[0038] 1. First create a...



Abstract

The invention discloses a commodity classification method based on a multi-modal deep neural network model. The method comprises the following steps: first, converting text information into word vectors that better reflect the relations between words, and extracting features from the commodity text description with the text classification model TextCNN; to prevent overfitting, applying image augmentation operations such as random flipping and random brightness changes to the images, and then feeding the processed images into the commodity image classification model ResNet101 to extract image features; flattening the feature vectors produced by the two models with a flatten function, concatenating the feature vectors of the two modalities along the feature dimension, and finally sending the result to a classifier to classify the commodities. The method avoids the limitations of traditional single-modality data for commodity classification; by combining text data and image data, its classification performance and accuracy exceed those of models using single-modality data.
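The pipeline described in the abstract can be sketched in skeletal form as follows. The stub extractors below merely stand in for TextCNN and ResNet101 (which are far too large to reproduce here), and all shapes, the augmentation ranges, and the class count are illustrative assumptions rather than values from the patent:

```python
import numpy as np

rng = np.random.default_rng(42)

def textcnn_features(word_vectors):
    """Stand-in for TextCNN: max-over-time pooling of the word vectors."""
    return word_vectors.max(axis=0)                  # (embed_dim,)

def augment(image):
    """Random horizontal flip and brightness change, as in the abstract."""
    if rng.random() < 0.5:
        image = image[:, ::-1, :]                    # flip along the width axis
    return np.clip(image * rng.uniform(0.8, 1.2), 0.0, 1.0)

def resnet101_features(image):
    """Stand-in for ResNet101: global average pooling over the pixels."""
    return image.mean(axis=(0, 1))                   # (channels,)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# 1. Text branch: word vectors -> text features.
words = rng.normal(size=(20, 300))                   # 20 words, 300-d embeddings
text_feat = textcnn_features(words)

# 2. Image branch: augment, then extract image features.
image = rng.uniform(size=(224, 224, 3))
img_feat = resnet101_features(augment(image))

# 3. Flatten both feature vectors and concatenate on the feature dimension.
fused = np.concatenate([text_feat.ravel(), img_feat.ravel()])   # (303,)

# 4. Send the fused vector to a classifier.
num_classes = 8
W = rng.normal(size=(num_classes, fused.size))
probs = softmax(W @ fused)
print(int(probs.argmax()))
```

In a real implementation both branches and the classifier would be trained jointly, so the classifier's weights learn correlations between the text and image features, which is the point of the feature-level fusion the patent describes.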

Description

Technical Field

[0001] The invention relates to a commodity classification method, in particular to a commodity classification method based on a multimodal deep neural network model, and belongs to the technical field of commodity classification.

Background Technique

[0002] With the vigorous development of e-commerce platforms, category-system construction based on big-data mining will gradually replace manual construction owing to its speed, efficiency, automation, and low cost. Internet technology is developing rapidly, e-commerce plays an ever larger role in daily life, and online shopping has become the first choice for many people. As the number of commodities grows, finding a classification method with fast classification speed and high accuracy is not only an urgent need for users to choose the commodities they need from a lar...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F16/35, G06K9/62, G06N3/04, G06Q30/06
CPC: G06F16/353, G06Q30/0643, G06N3/045, G06F18/214
Inventors: 刘凡, 高瑞涿, 邓言仪, 张伟娟
Owner HOHAI UNIV