
Automatic foreground-background segmentation method for lepidopteran images based on a fully convolutional neural network

A fully convolutional neural network technology for lepidopteran images, applied in image analysis, image enhancement, and image data processing, that improves accuracy and efficiency, reduces labor costs, and eliminates background interference.

Inactive Publication Date: 2018-11-02
ZHEJIANG GONGSHANG UNIVERSITY

AI Technical Summary

Problems solved by technology

The method solves the problem of automatically extracting the effective foreground area of insect image samples through computer deep-learning technology, making the automatic identification of Lepidoptera insect species a fully automatic process.



Examples


Example 1

[0057] 1. Use the matting function module of "Light and Shadow Magic Hand" or the GrabCut + Lazy Snapping tool to interactively remove the background of the training- and test-set insect specimen images, setting the background to black and the foreground to white to obtain the target image for foreground-background segmentation.
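A rough sketch of this masking step, assuming OpenCV's GrabCut is used in place of the interactive tools; the bounding rectangle `rect` is a hypothetical user-supplied initialization around the specimen, and the file names are illustrative.

```python
import cv2
import numpy as np

def make_target_mask(image_path, rect, iterations=5):
    """Run GrabCut and return a binary target image: foreground white, background black."""
    img = cv2.imread(image_path)
    mask = np.zeros(img.shape[:2], np.uint8)
    bgd_model = np.zeros((1, 65), np.float64)  # internal GrabCut background model
    fgd_model = np.zeros((1, 65), np.float64)  # internal GrabCut foreground model
    cv2.grabCut(img, mask, rect, bgd_model, fgd_model, iterations, cv2.GC_INIT_WITH_RECT)
    # Sure/probable foreground pixels become white (255); everything else black (0).
    binary = np.where((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD), 255, 0).astype(np.uint8)
    return binary

# Example usage: rect = (x, y, width, height) roughly enclosing the specimen.
# target = make_target_mask("specimen_001.jpg", rect=(50, 40, 400, 300))
# cv2.imwrite("specimen_001_mask.png", target)
```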

[0058] 2. Randomly select 80% of the data set as training data and the remaining 20% as test data. For the training set, apply image data augmentation methods such as rotation by ±5 degrees, left-right translation, up-down translation, random brightness scaling with a factor c ∈ [0.8, 1.2], and horizontal flipping to expand the library to more than 8 times its original size. For geometric operations such as rotation, translation, and horizontal flipping, the target image for foreground-background segmentation must be transformed in the same way. Data augmentation effectively prevents over-fitting during network training.
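A minimal sketch of this augmentation step, assuming plain OpenCV/NumPy. The rotation, brightness, and flip settings follow the text; the translation range (up to 5% of the image size) and all names are illustrative assumptions. Geometric transforms are applied identically to the image and its segmentation target, while brightness scaling touches the image only.

```python
import random
import cv2
import numpy as np

def augment_pair(img, mask):
    """Return one randomly augmented (image, target-mask) pair."""
    h, w = img.shape[:2]

    # Random rotation within +/-5 degrees about the image centre.
    angle = random.uniform(-5, 5)
    m_rot = cv2.getRotationMatrix2D((w / 2, h / 2), angle, 1.0)
    img = cv2.warpAffine(img, m_rot, (w, h))
    mask = cv2.warpAffine(mask, m_rot, (w, h), flags=cv2.INTER_NEAREST)

    # Random left-right and up-down translation (assumed up to 5% of the image size).
    tx, ty = random.uniform(-0.05, 0.05) * w, random.uniform(-0.05, 0.05) * h
    m_shift = np.float32([[1, 0, tx], [0, 1, ty]])
    img = cv2.warpAffine(img, m_shift, (w, h))
    mask = cv2.warpAffine(mask, m_shift, (w, h), flags=cv2.INTER_NEAREST)

    # Random brightness scaling with factor c in [0.8, 1.2] (image only).
    c = random.uniform(0.8, 1.2)
    img = np.clip(img.astype(np.float32) * c, 0, 255).astype(np.uint8)

    # Random horizontal flip, applied to both image and mask.
    if random.random() < 0.5:
        img, mask = cv2.flip(img, 1), cv2.flip(mask, 1)

    return img, mask
```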

[0...

Example 2

[0065] 1. Use the matting function module of "Light and Shadow Magic Hand" or the GrabCut + Lazy Snapping tool to interactively remove the background of the training- and test-set insect specimen images, setting the background to black and the foreground to white to obtain the target image for foreground-background segmentation.

[0066] 2. Randomly select 80% of the data set as training data and the remaining 20% as test data. For the training set, apply image data augmentation methods such as rotation by ±5 degrees, left-right translation, up-down translation, random brightness scaling with a factor c ∈ [0.8, 1.2], and horizontal flipping to expand the library to more than 8 times its original size. For translation and horizontal-flip operations, the target image for foreground-background segmentation must be transformed in the same way. Data augmentation effectively prevents over-fitting during network training.

[0067] 3. Modify ...



Abstract

The invention provides an automatic foreground-background segmentation method for lepidopteran images based on a fully convolutional neural network (FCN). A fully convolutional network for pixel-level classification prediction is built by fine-tuning a pre-trained CNN model. Before the network is trained, data augmentation is applied to the insect image data set to meet the sample-quantity requirements of deep neural network training. Outputs of different convolutional layers are fused, and a network model suited to segmenting the foreground and background of lepidopteran images is obtained through exploration. Starting from the initial segmentation result of the network, edge details are refined with a conditional random field (CRF), and the maximum foreground contour is extracted and filled to remove noise in the network output and cavities in the foreground. With this method, the preprocessing of insect images is completely automatic, and the efficiency of automatic lepidopteran species recognition can be significantly improved.
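A hedged sketch of the contour-based post-processing mentioned in the abstract, assuming OpenCV (4.x return signature for findContours): only the largest external foreground contour of the network's binary output is kept and filled, which discards isolated noise blobs and closes cavities inside the foreground. Function and variable names are illustrative, not the patented implementation.

```python
import cv2
import numpy as np

def keep_largest_contour(binary_mask):
    """binary_mask: uint8 array with foreground = 255 and background = 0."""
    # RETR_EXTERNAL returns only the outermost contours (OpenCV 4.x signature).
    contours, _ = cv2.findContours(binary_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return binary_mask
    largest = max(contours, key=cv2.contourArea)
    cleaned = np.zeros_like(binary_mask)
    # thickness=-1 fills the contour interior, removing holes inside the foreground.
    cv2.drawContours(cleaned, [largest], -1, 255, thickness=-1)
    return cleaned
```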

Description

Technical field

[0001] The invention relates to an automatic foreground-background segmentation method for Lepidoptera insect images based on a fully convolutional neural network (FCN), which eliminates background interference during insect image analysis and recognition and improves the accuracy and efficiency of analysis and recognition. The method fully automates the image preprocessing task in the automatic classification and recognition of insect images, thereby reducing the labor cost of manual segmentation. The technology can be integrated into automatic identification systems for Lepidoptera insects and applied to plant quarantine and plant pest forecasting and prevention, and can be adopted by customs, plant quarantine departments, agricultural and forestry pest control, and other departments.

Background technique

[0002] Lepidoptera insects are one of the main pests in agriculture. In the larval stage, they eat the mesophyll of...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/194; G06K9/62
CPC: G06T7/194; G06T2207/20081; G06T2207/20084; G06F18/25
Inventors: 竺乐庆, 马梦园
Owner: ZHEJIANG GONGSHANG UNIVERSITY