Chopstick image classification method based on adaptive convolutional neural network

A convolutional neural network classification technology, applied to chopstick image classification based on an adaptive convolutional neural network. It solves the problem that unqualified chopsticks with defects smaller than fixed, preset detection parameters cannot be detected, and achieves high classification accuracy, improved product quality, and improved core competitiveness.

Active Publication Date: 2019-08-16
XIDIAN UNIV


Problems solved by technology

[0005] The shortcoming of existing methods is that they classify chopsticks directly through traditional machine vision, without using neural networks. Traditional machine vision summarizes the characteristics of unqualified chopsticks for a specific detection target on the assembly line, sets fixed parameters such as a color-detection threshold and crack width and length thresholds, and then classifies according to those fixed parameters. It can therefore detect only unqualified chopsticks whose defects reach the set parameters, and misses those whose defects are smaller. Neural network technology, by contrast, extracts features such as whether a chopstick image contains defects and judges from them whether the chopsticks are qualified, so it can classify chopsticks more accurately.



Examples


Embodiment 1

[0032] Chopsticks are the most important tableware not only in China but throughout Asia. With continuing economic development, people's requirements for diet and hygiene are rising, so their requirements for tableware, and chopsticks in particular, are also gradually increasing. People not only require chopsticks to be durable and free from defects, but even have images engraved on them, so the quality requirements for chopsticks keep growing. To ensure food hygiene, ordinary families, restaurants and canteens replace chopsticks ever more frequently; with the vigorous growth of the takeaway industry, demand for disposable chopsticks has grown exponentially. Most traditional chopsticks are made of wood, bamboo, stainless steel and similar materials, and with the continuing development of manufacturing and the demand for environmental protection, many new materials are also used to make chop...

Embodiment 2

[0046] The chopstick image classification method based on an adaptive convolutional neural network is the same as in Embodiment 1. The initial adaptive convolutional neural network constructed in step (2) has the following structure, in order: first convolutional layer → first downsampling layer → second convolutional layer → second downsampling layer → third convolutional layer → third downsampling layer → fourth convolutional layer → fourth downsampling layer → fifth convolutional layer → fifth downsampling layer → first dropout layer → first fully connected layer.

[0047] The first ten layers of the twelve-layer initial adaptive convolutional neural network consist of five convolutional layers interleaved with five downsampling layers. The five convolutional layers fully guarantee the network's feature-extraction capability and improve classification accuracy; the five downsampling layers fully compress the data size and reduce the computational load of the networ...
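The twelve-layer ordering described in paragraph [0046] can be sketched programmatically. This is an illustrative encoding only, representing the network as an ordered list of layer names rather than the patent's actual implementation, so the structure (five conv/downsample pairs followed by dropout and a fully connected layer) is easy to check:

```python
# Build the layer sequence of the initial adaptive CNN from [0046]
# as a plain list of names, purely to make the ordering checkable.
layers = []
for i in range(1, 6):                 # five conv/downsampling pairs
    layers.append(f"conv{i}")
    layers.append(f"downsample{i}")
layers.append("dropout1")             # eleventh layer
layers.append("fc1")                  # twelfth layer: fully connected

assert len(layers) == 12
assert sum(name.startswith("conv") for name in layers) == 5
assert sum(name.startswith("downsample") for name in layers) == 5
print(" -> ".join(layers))
```

The asserts encode the counts stated in [0047]: ten interleaved conv/downsampling layers, then the dropout and fully connected layers.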

Embodiment 3

[0049] The chopstick image classification method based on an adaptive convolutional neural network is the same as in Embodiments 1-2. The parameters of each layer in the initial adaptive convolutional neural network constructed by the present invention can be changed according to the characteristics of the current chopstick data set. Taking a bamboo chopstick data set containing five kinds of defects as an example, the parameters are set as follows:

[0050] The total number of input layer feature maps of the adaptive convolutional neural network is set to 3, and the feature map size is set to 80×500.

[0051] The total number of convolution filters in the first convolutional layer is set to 32, the pixel size of each filter to 3×3, the feature map size to 80×500, and the convolution stride to 1 pixel.

[0052] The size of the pooling area in the first downsampling layer is set to 3×3, the pooling step is set to 3 pixels, and the feature map size is set to 27×1...
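The feature-map sizes in paragraphs [0050]-[0052] can be checked with standard output-size arithmetic. A minimal sketch follows; two details are assumptions, since the embodiment does not state them: the 3×3, stride-1 convolution uses "same" padding (which is what preserves the 80×500 size reported in [0051]), and the pooling uses ceiling rounding (floor rounding would give 26 rows rather than the 27 in [0052]):

```python
import math

def conv_out(size, kernel=3, stride=1, padding=1):
    """Convolution output size; padding=1 with a 3x3 kernel is
    'same' padding, so stride 1 preserves the input size."""
    return (size + 2 * padding - kernel) // stride + 1

def pool_out(size, kernel=3, stride=3):
    """Pooling output size with ceiling rounding (assumed: floor
    rounding would give 26 rows, not the 27 stated in [0052])."""
    return math.ceil((size - kernel) / stride) + 1

h, w = 80, 500                     # input feature-map size ([0050])
h, w = conv_out(h), conv_out(w)    # first convolutional layer
print((h, w))                      # (80, 500), as stated in [0051]
h = pool_out(h)                    # first downsampling layer, height
print(h)                           # 27, matching the "27x..." in [0052]
```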



Abstract

The invention discloses a chopstick image classification method based on an adaptive convolutional neural network, which solves the problem of adaptive, precise classification of chopsticks of different materials and types. The method comprises the following steps: normalizing the chopstick images; establishing a twelve-layer network model; training the network and calculating its accuracy; randomly selecting one convolutional layer as an extension convolutional layer, expanding the convolutional neural network by adding seven network structures at the extension layer to obtain seven neural networks, training and testing the seven networks, selecting the one with the highest test accuracy, and continuing the expansion if the accuracy of the selected network exceeds that of the current network; and training and testing the resulting optimal adaptive convolutional neural network. Because the adaptive convolutional neural network adjusts its structure during training, increasing the depth and width of the network, the finally constructed network can classify chopsticks accurately.
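The expansion procedure in the abstract amounts to a greedy loop: pick an extension point, build seven expanded variants, keep the best one while it improves on the current network. The sketch below is illustrative only; the network is a plain list of layer names, and `toy_acc` and the seven candidate builders are stand-ins for the patent's actual training step and network structures, which the abstract does not fully specify:

```python
import random

def adaptive_expansion(net, acc, candidate_builders, train_and_test, rng=None):
    """Greedy expansion loop sketched from the abstract.

    net: current network, here just an ordered list of layer names.
    candidate_builders: seven functions, each taking (net, layer_index)
    and returning an expanded network (stand-ins for the patent's
    seven added structures).
    """
    rng = rng or random.Random(0)          # seeded for reproducibility
    while True:
        # randomly select one convolutional layer as the extension layer
        conv_idx = [i for i, name in enumerate(net) if name.startswith("conv")]
        layer = rng.choice(conv_idx)
        # build the seven expanded candidates and test each one
        candidates = [build(net, layer) for build in candidate_builders]
        scored = [(train_and_test(c), c) for c in candidates]
        best_acc, best_net = max(scored, key=lambda s: s[0])
        if best_acc <= acc:                # no candidate beats the current net
            return net, acc                # stop: keep the current network
        net, acc = best_net, best_acc      # adopt the best one and continue

# Toy demonstration: each "candidate" inserts one extra conv layer,
# and the toy accuracy saturates at 0.9, so the loop must terminate.
builders = [
    (lambda n, i, k=k: n[:i] + [f"conv_new{k}"] + n[i:]) for k in range(7)
]
toy_acc = lambda net: min(0.9, 0.5 + 0.05 * len(net))
start = ["conv1", "pool1", "conv2", "pool2", "fc"]
final, acc = adaptive_expansion(start, toy_acc(start), builders, toy_acc)
print(len(final), acc)
```

The stopping rule mirrors the abstract: expansion continues only while the best expanded network's test accuracy strictly exceeds the current network's.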

Description

technical field

[0001] The invention belongs to the technical field of image processing and further relates to chopstick image classification, in particular to a chopstick image classification method based on an adaptive convolutional neural network. The invention is mainly used for classifying images of chopsticks and can be applied in chopstick processing and production.

Background technique

[0002] Chopsticks here refer to chopsticks processed from raw materials such as bamboo and wood in industrial production. If the raw materials used are polluted or damaged, the finished chopsticks will be defective. Even when the raw materials are sound, machine failures, manual operation errors and other problems during production can still introduce various defects. The main defect types of chopsticks include dead spots, green...

Claims


Application Information

IPC(8): G06K9/62
CPC: G06F18/24; Y02P90/30
Inventor: 庞博, 盛立杰, 苗启广
Owner: XIDIAN UNIV