
Attention mechanism relationship comparison network model method based on small sample learning

A small sample learning and attention mechanism technology, applied to neural learning methods, biological neural network models, computer components, etc., addressing problems such as the difficulty of improving model classification accuracy.

Active Publication Date: 2019-07-16
BEIJING TECHNOLOGY AND BUSINESS UNIVERSITY +1

AI Technical Summary

Problems solved by technology

Using deep convolutional networks to extract image features is a critical step in small sample (few-shot) learning. However, with existing meta-learning methods it is difficult for a deep convolutional network to improve the classification accuracy of the model and to stabilize its final training result.



Examples


Detailed Description of Embodiments

[0082] The present invention is further described below through embodiments with reference to the accompanying drawings, which do not limit the scope of the present invention in any way.

[0083] The present invention proposes an attention mechanism relation comparison network learning method based on small sample learning, the ARCN method. The network is trained end to end. Spectral normalization and an attention mechanism are introduced into the convolutional neural network to extract feature information from the small-sample images, and this feature information is then recombined. Finally, the relation encoding module learns a deep metric over the combined features and compares the relationship between images, achieving higher accuracy and more stable training on small-sample images. Figure 3 shows the overall flowchart of the implementation of the present invention.
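By way of illustration, the following Python (PyTorch) sketch shows one possible form of such a feature encoder, with spectral normalization applied to each convolution and a simple channel-attention block on the output feature map. The four-block structure, layer sizes and attention design are assumptions for illustration only; the excerpt of the patent shown here does not specify them.

# Hypothetical sketch of an ARCN-style feature encoder (assumed design,
# not the patented implementation): spectral-normalized convolutions
# followed by a simple channel-attention block.
import torch.nn as nn
from torch.nn.utils import spectral_norm

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (assumed design)."""
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels), nn.Sigmoid(),
        )

    def forward(self, x):
        # x: (batch, channels, H, W); weight each channel by a value in (0, 1)
        w = self.fc(x.mean(dim=(2, 3)))
        return x * w.unsqueeze(-1).unsqueeze(-1)

def conv_block(in_ch: int, out_ch: int) -> nn.Sequential:
    """3x3 convolution with spectral normalization, BN, ReLU and 2x2 max-pooling."""
    return nn.Sequential(
        spectral_norm(nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)),
        nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True), nn.MaxPool2d(2),
    )

class FeatureEncoder(nn.Module):
    """Four conv blocks followed by channel attention on the feature map."""
    def __init__(self, in_ch: int = 3, hidden: int = 64):
        super().__init__()
        self.blocks = nn.Sequential(
            conv_block(in_ch, hidden), conv_block(hidden, hidden),
            conv_block(hidden, hidden), conv_block(hidden, hidden),
        )
        self.attention = ChannelAttention(hidden)

    def forward(self, images):
        return self.attention(self.blocks(images))

Applied to both the support and the query images, an encoder of this kind would produce the feature maps that the feature combination step described in the abstract then recombines.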

[0084] The following embodiments are aimed at the small-sample publ...
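For context, embodiments of this kind are typically trained and evaluated with episodic N-way, K-shot sampling over such datasets. The sketch below illustrates that standard sampling convention; it is a common assumption for few-shot experiments, not text taken from the patent.

# Minimal sketch of episodic N-way, K-shot sampling (an assumed convention
# for few-shot experiments, not taken from the patent text).
import random
from collections import defaultdict

def sample_episode(dataset, n_way=5, k_shot=1, n_query=15):
    """Sample one episode from an iterable of (image, label) pairs.

    Returns (support, query): lists of (image, episode_label) pairs, where
    episode_label is the class index within this episode (0 .. n_way - 1).
    """
    by_class = defaultdict(list)
    for image, label in dataset:
        by_class[label].append(image)

    classes = random.sample(list(by_class), n_way)
    support, query = [], []
    for episode_label, cls in enumerate(classes):
        images = random.sample(by_class[cls], k_shot + n_query)
        support += [(img, episode_label) for img in images[:k_shot]]
        query += [(img, episode_label) for img in images[k_shot:]]
    return support, query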



Abstract

The invention discloses an attention mechanism relation comparison network model method. An attention relation comparison network model for small sample learning is constructed under a small amount of labeled sample data. Based on a relation network architecture, the model is divided into a feature encoding part, a feature combination part and a relation encoding part. The feature encoding module extracts image feature information; the feature combination part recombines the extracted query image feature information with the support (training) image feature information of each group to form a new combined feature map; and the relation encoding module performs nonlinear metric learning for the network. By introducing an attention mechanism and a spectral normalization method into an end-to-end deep convolutional neural network model, the model achieves higher classification accuracy under small sample learning, the stability of the final training result is improved, and the image classification accuracy of existing models in small sample learning is improved.
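To make the feature combination and relation encoding steps concrete, the sketch below pairs each query feature map with each class's support feature map by channel-wise concatenation and scores the pair with a small relation module. The pairing scheme, layer sizes and the assumed 5x5 feature map size are illustrative assumptions that continue the encoder sketch above, not the patented configuration.

# Hypothetical feature combination and relation encoding (assumed design).
import torch
import torch.nn as nn

class RelationModule(nn.Module):
    """Maps a combined (support, query) feature map to a relation score in (0, 1)."""
    def __init__(self, channels: int = 64, feat_hw: int = 5):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(2 * channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels), nn.ReLU(inplace=True), nn.MaxPool2d(2),
        )
        self.fc = nn.Sequential(
            nn.Linear(channels * (feat_hw // 2) ** 2, 8), nn.ReLU(inplace=True),
            nn.Linear(8, 1), nn.Sigmoid(),
        )

    def forward(self, combined):
        return self.fc(self.conv(combined).flatten(1))

def relation_scores(support_feats, query_feats, relation_module):
    """Combine every query feature map with every class's support feature map
    (channel-wise concatenation) and score each pair.

    support_feats: (n_way, C, H, W) -- one (e.g. averaged) feature map per class
    query_feats:   (n_query, C, H, W)
    returns:       (n_query, n_way) relation scores
    """
    n_way, n_query = support_feats.size(0), query_feats.size(0)
    s = support_feats.unsqueeze(0).expand(n_query, -1, -1, -1, -1)
    q = query_feats.unsqueeze(1).expand(-1, n_way, -1, -1, -1)
    combined = torch.cat([s, q], dim=2)       # concatenate along channels
    combined = combined.flatten(0, 1)         # (n_query * n_way, 2C, H, W)
    return relation_module(combined).view(n_query, n_way)

During training, such scores would typically be pushed toward 1 for matching classes and 0 otherwise, which is the nonlinear metric learning role the abstract assigns to the relation encoding module.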

Description

Technical Field

[0001] The invention belongs to the technical fields of image processing, pattern recognition and machine vision, and relates to image classification and recognition network model technology, in particular to a network model method for attention mechanism relation comparison based on small sample learning. An attention relation comparison network model for small sample learning is constructed on a small amount of labeled sample data and can effectively improve image classification accuracy under small sample learning.

Background

[0002] In recent years, the unprecedented breakthroughs of deep learning in various fields have largely relied on large amounts of available labeled data, which are costly to collect and annotate; this severely limits the ability to learn new categories. More importantly, these deep learning models struggle with problems that have only a small amount of labeled data. Therefore, the use...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06K9/62G06N3/04G06N3/08
CPCG06N3/08G06N3/045G06F18/213G06F18/214
Inventor 于重重马先钦冯文彬
Owner BEIJING TECHNOLOGY AND BUSINESS UNIVERSITY