Fly face recognition method based on deep convolutional neural network

A convolutional neural network and depthwise convolution technology, applied in the field of biometric identification, which addresses the problems of insufficient attention to inter-class information differences, low insect identification accuracy, and high similarity among flies, with the effects of reducing image gradient loss, preventing the loss of a large amount of information, and improving accuracy.

Active Publication Date: 2020-09-29
DALIAN MARITIME UNIVERSITY

AI Technical Summary

Problems solved by technology

However, because it does not pay enough attention to the information differences between categories, its classification ability is limited.
4) Fly image recognition based on deep learning, which uses a convolutional neural network to identify the fly as a whole; however, flies are highly similar to one another, features extracted from the whole fly body are limited, and the insect recognition accuracy is therefore low.

Method used


Image

  • Fly face recognition method based on deep convolutional neural network
  • Fly face recognition method based on deep convolutional neural network
  • Fly face recognition method based on deep convolutional neural network

Examples


Embodiment

[0069] As shown in figure 1, this embodiment includes the following steps:

[0070] Step 1: First, apply depthwise separable convolution to the MTCNN network: decompose each standard convolution in MTCNN into a depthwise convolution followed by a pointwise convolution, reducing the amount of computation as much as possible while maintaining accuracy, so as to obtain the fly face detection box and five feature points. The MTCNN network consists of three parts: P-Net, R-Net and O-Net. The fully convolutional P-Net generates candidate boxes on the multi-scale images to be detected, which are then filtered by R-Net and O-Net. The total loss function is as follows:

[0071] $\min \sum_{i=1}^{N} \sum_{j \in \{\text{det},\, \text{box},\, \text{landmark}\}} \alpha_j \, \beta_i^{j} \, L_i^{j}$

[0072] In the above formula, N is the total number of training samples, L_i^j is the loss of sample i on task j, β_i^j ∈ {0, 1} indicates whether sample i participates in task j, and α_j denotes the weight of each task's loss. In P-Net and R-Net, set α_det = 1, α_box = 0.5, α_landmark = 0.5; in O-Net, set α_det = 1, α_box = 0.5, α_landmark = 1. ...
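To illustrate the decomposition described in Step 1, the following is a minimal PyTorch sketch of a depthwise separable convolution, i.e. a depthwise convolution followed by a pointwise (1×1) convolution. The channel counts and input size are illustrative assumptions and do not reproduce the patent's actual MTCNN layers.

```python
# Minimal sketch (not the patent's exact layers): replacing a standard
# convolution with a depthwise convolution followed by a pointwise (1x1)
# convolution, as Step 1 applies to the MTCNN sub-networks.
import torch
import torch.nn as nn

class DepthwiseSeparableConv(nn.Module):
    def __init__(self, in_channels, out_channels, kernel_size=3, stride=1, padding=1):
        super().__init__()
        # Depthwise convolution: one filter per input channel (groups=in_channels)
        self.depthwise = nn.Conv2d(in_channels, in_channels, kernel_size,
                                   stride=stride, padding=padding, groups=in_channels)
        # Pointwise convolution: 1x1 convolution mixes information across channels
        self.pointwise = nn.Conv2d(in_channels, out_channels, kernel_size=1)

    def forward(self, x):
        return self.pointwise(self.depthwise(x))

# A standard 3x3 convolution with 16 input and 32 output channels has
# 16*32*3*3 = 4608 weights; the depthwise separable version has only
# 16*3*3 + 16*32 = 656 weights, which is where the computation savings come from.
x = torch.randn(1, 16, 48, 48)          # illustrative input size
y = DepthwiseSeparableConv(16, 32)(x)   # -> shape (1, 32, 48, 48)
```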



Abstract

The invention provides a fly face recognition method based on a deep convolutional neural network. The method comprises the steps of: optimizing a multi-task convolutional neural network (MTCNN) to perform face positioning and feature point detection on a fly image; obtaining a fly face feature point image through positioning and detection, and carrying out fly face alignment; making the aligned fly face images into a data set; building a fly face deep convolutional neural network model; concatenating the coarse and fine groups of feature vectors extracted in the fly face deep convolutional neural network into one group of feature vectors; and testing the fly face deep convolutional neural network model on the test set to verify the fly recognition effect. The method adopts a fly face deep convolutional neural network in which contour features of an image are roughly extracted by a series of convolution and pooling layers, and features of specific parts, such as the small eyes within the fly's compound eyes, are then extracted by Inception-ResNet and Reduction networks. This network simplifies the learning target and difficulty and extracts richer feature vectors while preventing gradient loss.
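As a rough illustration of the coarse-and-fine fusion described in the abstract, the sketch below concatenates (connects in series) the feature vectors from a coarse contour branch and a finer branch. All layer choices and dimensions are illustrative assumptions; the fine branch merely stands in for the patent's Inception-ResNet and Reduction stages rather than reproducing them.

```python
# Minimal sketch of "concatenate coarse and fine feature vectors into one group".
# Dimensions and branch contents are illustrative, not the patent's architecture.
import torch
import torch.nn as nn

class CoarseFineFusion(nn.Module):
    def __init__(self, coarse_dim=256, fine_dim=512, num_classes=10):
        super().__init__()
        # Coarse branch: a few convolution + pooling layers for contour features
        self.coarse = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(2),
            nn.Flatten(), nn.Linear(64 * 2 * 2, coarse_dim),
        )
        # Fine branch: stand-in for the Inception-ResNet / Reduction stages
        self.fine = nn.Sequential(
            nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(128, fine_dim),
        )
        # Classifier operates on the concatenated (series-connected) vector
        self.classifier = nn.Linear(coarse_dim + fine_dim, num_classes)

    def forward(self, x):
        fused = torch.cat([self.coarse(x), self.fine(x)], dim=1)
        return self.classifier(fused)

logits = CoarseFineFusion()(torch.randn(1, 3, 160, 160))  # -> shape (1, 10)
```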

Description

Technical field

[0001] The present invention relates to the technical field of biometric identification, and in particular to a fly face recognition method based on a deep convolutional neural network.

Background technique

[0002] In recent years, biometric identification technology has developed rapidly. With increasingly frequent trade between countries, the probability that foreign fly insects carried by passengers are introduced into the country is rising, which may even damage the ecological environment. Research on efficient identification of fly insect species is therefore urgent. At present, the main fly recognition methods are: 1) Fly insect recognition based on color features, which collects image color histogram information and merges all color histograms into a feature vector used as the color feature of the fly insect. The color feature is not affected by image rotation...
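For concreteness, a minimal sketch of the color-histogram feature described in background method 1) is shown below, merging per-channel histograms into one feature vector. The bin count and normalization are illustrative choices, not specified in the patent.

```python
# Minimal sketch of background method 1): per-channel color histograms
# merged into a single feature vector (bin count is an illustrative choice).
import numpy as np

def color_histogram_feature(image, bins=16):
    """image: H x W x 3 uint8 array; returns a 3*bins normalized feature vector."""
    feats = []
    for c in range(3):  # one histogram per color channel
        hist, _ = np.histogram(image[..., c], bins=bins, range=(0, 256))
        feats.append(hist.astype(np.float64) / hist.sum())  # normalize per channel
    return np.concatenate(feats)

feature = color_histogram_feature(
    np.random.randint(0, 256, (120, 160, 3), dtype=np.uint8))
print(feature.shape)  # (48,)
```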

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/46; G06K9/62; G06N3/04
CPC: G06V40/161; G06V40/168; G06V40/172; G06V10/454; G06N3/045; G06F18/214; G06F18/24; Y02T10/40
Inventor: 陈彦彤, 王俊生, 陈伟楠
Owner DALIAN MARITIME UNIVERSITY