Zero-sample image classification model based on repeated attention network and method thereof

A zero-sample image classification model technology, applied to computer components, character and pattern recognition, instruments, and the like. It addresses problems such as the heavy workload of manual category labeling, and achieves improved classification results, accurate image representation information, and alleviation of the strong-bias problem toward seen classes.

Active Publication Date: 2020-02-21
FUZHOU UNIV
View PDF | 7 Cites | 16 Cited by
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

However, the number of image categories is often very large, and new categories may be added from time to time; manually labeling every category each time would require an enormous amount of work. In this process, some categories have only a few or no labeled training samples, and a category with no training labels at all constitutes the zero-sample case. For such zero samples, effective classifiers cannot be constructed with traditional machine learning methods.



Examples


Embodiment Construction

[0039] The present invention will be further described below with reference to the accompanying drawings and embodiments.

[0040] Please refer to figure 1; the present invention provides a zero-sample image classification model based on a repeated attention network, comprising:

[0041] a repeated attention network module, used to train on and obtain the sequence information of image regions;

[0042] a generative adversarial network module, used to obtain visual error information;

[0043] a visual feature extraction network processing module, used to obtain the one-dimensional visual feature vector of an image;

[0044] an attribute semantic transformation network module, which uses two linear activation layers to map a low-dimensional attribute semantic vector to a high-dimensional feature vector with the same dimension as the visual feature vector;
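As a sketch of what the visual feature extraction step might look like: a CNN backbone yields a spatial feature map, which global average pooling collapses into the one-dimensional visual feature vector the module outputs. The 2048×7×7 feature-map shape (ResNet-style) and the pooling choice are assumptions for illustration; the patent does not specify them.

```python
import numpy as np

# Stand-in for a CNN backbone's output: a (channels, height, width)
# feature map. The shape is an assumed ResNet-style example.
feature_map = np.random.default_rng(0).random((2048, 7, 7))

def to_visual_vector(fmap):
    """Collapse a conv feature map to a 1-D visual feature vector
    by global average pooling over the spatial dimensions."""
    return fmap.mean(axis=(1, 2))

v = to_visual_vector(feature_map)
print(v.shape)  # (2048,)
```

Any pooling or flattening that produces a fixed-length vector would fit the module's stated role; average pooling is simply a common default.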

[0045] a visual-attribute semantic connection network, which realizes the fusion of the visual feature vector and the attribute semantic feature vector.
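A minimal sketch of the attribute semantic transformation and the visual-attribute fusion described above: two linear layers lift the low-dimensional attribute vector to the visual feature dimension, after which the two vectors are fused. The 85-dimensional attribute size (AwA-style), the ReLU between the layers, and the element-wise-product fusion are all assumptions; the patent states only "two layers of linear activation layers" and "fusion".

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: 85 attribute dims mapped up to a
# 2048-dim visual feature space via a 512-dim hidden layer.
ATTR_DIM, HID_DIM, VIS_DIM = 85, 512, 2048

W1 = rng.normal(0, 0.02, (ATTR_DIM, HID_DIM))
b1 = np.zeros(HID_DIM)
W2 = rng.normal(0, 0.02, (HID_DIM, VIS_DIM))
b2 = np.zeros(VIS_DIM)

def attribute_to_visual(a):
    """Map a low-dim attribute semantic vector to the visual
    feature dimension with two linear layers (ReLU assumed)."""
    h = np.maximum(a @ W1 + b1, 0.0)  # linear + ReLU
    return h @ W2 + b2                # linear

def fuse(visual, semantic):
    """Visual-attribute fusion; an element-wise product is one
    common choice, assumed here since the operator is not named."""
    return visual * semantic

a = rng.random(ATTR_DIM)   # class attribute vector
v = rng.random(VIS_DIM)    # image visual feature vector
s = attribute_to_visual(a)
fused = fuse(v, s)
print(fused.shape)  # (2048,)
```

In a trained model the weights would of course be learned; random weights here only demonstrate the shapes involved.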



Abstract

The invention relates to a zero-sample image classification model based on a repeated attention network. The model comprises a repeated attention network module, used to train on and obtain the sequence information of image regions; a generative adversarial network module, used to acquire visual error information; a visual feature extraction network processing module, used to obtain the one-dimensional visual feature vector of an image; an attribute semantic conversion network module, used to map a low-dimensional attribute semantic vector to a high-dimensional feature vector with the same dimension as the visual feature vector via two linear activation layers; a visual-attribute semantic connection network, used to fuse the visual feature vector and the attribute semantic feature vector; and a score classification result and reward output module, which classifies seen, labeled classes with a cross-entropy loss, while the reward output penalizes unlabeled, unseen data by punishing the highest-probability predictions among both seen and unseen classes on the unlabeled data. The invention can effectively solve the problem of missing image category labels.
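The score classification and reward output described above can be illustrated with a toy objective: a cross-entropy term on labeled samples from seen classes, plus a penalty term on unlabeled samples targeting the most confident seen-class and unseen-class predictions. The exact penalty form, the seen/unseen split, and the weight `lam` are assumptions for illustration, not the patent's formulation.

```python
import numpy as np

def softmax(z):
    z = z - z.max()  # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

def seen_ce_loss(logits, label):
    """Cross-entropy loss for a labeled sample from a seen class."""
    return -np.log(softmax(logits)[label])

def bias_penalty(logits, seen_idx, unseen_idx):
    """Penalize the single most confident seen-class and unseen-class
    predictions on an unlabeled sample, discouraging over-confident
    bias toward seen classes. This exact form is an assumption."""
    p = softmax(logits)
    return p[seen_idx].max() + p[unseen_idx].max()

# 7 seen classes, 3 unseen classes (split is illustrative)
seen_idx = np.arange(0, 7)
unseen_idx = np.arange(7, 10)

rng = np.random.default_rng(0)
labeled_logits = rng.normal(size=10)
unlabeled_logits = rng.normal(size=10)

lam = 0.1  # penalty weight (assumed hyperparameter)
total = seen_ce_loss(labeled_logits, label=3) \
    + lam * bias_penalty(unlabeled_logits, seen_idx, unseen_idx)
print(total)
```

Both terms are non-negative, so the combined objective pushes the model toward confident, correct predictions on seen classes while tempering over-confidence on unlabeled data.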

Description

Technical field

[0001] The invention relates to a zero-sample image classification model, and in particular to a zero-sample image classification model based on a repeated attention network and a method thereof.

Background technique

[0002] Currently, in image classification, accurately classifying images requires informing the model of the image labels of each category. However, the number of image categories is often very large, and new categories may be added from time to time; manually labeling every category each time would require an enormous amount of work. In this process, some categories have only a few or no labeled training samples, and a category with no training labels at all constitutes the zero-sample case; for such zero samples, effective classifiers cannot be constructed with traditional machine learning methods. The purpose of zero-shot learning image classification is to solve the problem of missing labels for entire categories and classify the categories that...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62
CPC: G06F18/241
Inventor: 廖祥文, 肖永强, 叶锴, 徐戈, 陈开志
Owner: FUZHOU UNIV