A plant image fine-grained classification method based on discriminant key domains and deep learning
A deep-learning-based classification technology, applied in character and pattern recognition, biological neural network models, instruments, etc. It addresses the problem of low accuracy in fine-grained classification of plant images and achieves improved classification accuracy and high research value.
Embodiment Construction
[0029] The present invention will be described in detail below in conjunction with the accompanying drawings and specific embodiments. This embodiment is carried out on the premise of the technical solution of the present invention, and a detailed implementation and a specific operation process are given, but the protection scope of the present invention is not limited to the following embodiments.
[0030] The present invention provides a plant image fine-grained classification method based on discriminative key domains and deep learning (referred to as DL-CNN). A CNN classification model that considers the key domain and the global domain simultaneously is used to perform fine-grained classification of the images to be classified. A DeepLab-based method realizes pixel-level semantic segmentation of the plant image to find the key regions with discriminative significance, these key regions are combined with the global domain, a CNN model is used to extract semantic features, and a softmax classifier produces the final classification.
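The pipeline described in paragraph [0030] (DeepLab segmentation to locate the key domain, a CNN applied to both the key and global domains, and a softmax classifier) can be sketched as follows. This is a minimal illustration assuming a PyTorch/torchvision setup; the backbone choices (deeplabv3_resnet50, resnet18), the two-class plant/background mask, the bounding-box cropping heuristic, and the class count are assumptions for illustration, not details specified in the patent.

```python
# Minimal sketch of a DL-CNN-style pipeline: segment, crop the key domain,
# then classify using key-domain and global-domain features together.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import resnet18
from torchvision.models.segmentation import deeplabv3_resnet50


def key_domain_crop(image, seg_model, out_size=224):
    """Segment the plant image and crop the discriminative foreground region."""
    with torch.no_grad():
        mask = seg_model(image.unsqueeze(0))["out"].argmax(1)[0]  # [H, W] class map
    ys, xs = torch.nonzero(mask > 0, as_tuple=True)
    if len(ys) == 0:
        crop = image  # no foreground found: fall back to the full image
    else:
        crop = image[:, ys.min():ys.max() + 1, xs.min():xs.max() + 1]
    return F.interpolate(crop.unsqueeze(0), size=(out_size, out_size),
                         mode="bilinear", align_corners=False)[0]


class DLCNN(nn.Module):
    """Two-branch CNN: one branch on the key domain, one on the global domain."""
    def __init__(self, num_classes):
        super().__init__()
        self.key_branch = resnet18(weights=None)
        self.global_branch = resnet18(weights=None)
        feat_dim = self.key_branch.fc.in_features
        self.key_branch.fc = nn.Identity()     # keep the 512-d semantic features
        self.global_branch.fc = nn.Identity()
        self.classifier = nn.Linear(2 * feat_dim, num_classes)  # softmax head

    def forward(self, global_img, key_img):
        feats = torch.cat([self.global_branch(global_img),
                           self.key_branch(key_img)], dim=1)
        return self.classifier(feats)  # logits; softmax / cross-entropy applied outside


# Usage: segment, crop the key domain, then classify with both views.
seg_model = deeplabv3_resnet50(weights=None, num_classes=2).eval()  # plant vs. background (assumed)
model = DLCNN(num_classes=100)                                      # e.g. 100 plant species (assumed)
img = torch.rand(3, 224, 224)
key = key_domain_crop(img, seg_model)
probs = F.softmax(model(img.unsqueeze(0), key.unsqueeze(0)), dim=1)
```

Concatenating the two 512-d feature vectors before the linear layer is one simple way to "combine the global domain" with the key domain; the patent text does not specify the fusion scheme, so other choices (weighted sums, score-level fusion) are equally plausible.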