
Zero sample target classification method based on pseudo sample feature synthesis

A technology for object classification using pseudo-samples, applied in the field of zero-sample object classification based on pseudo-sample feature synthesis. It addresses the problem of low classification accuracy and achieves the effects of improving operating efficiency, ensuring feature richness, and avoiding the domain drift problem.

Inactive Publication Date: 2019-10-08
HARBIN ENG UNIV

AI Technical Summary

Problems solved by technology

[0007] This method converts the zero-sample target classification problem into a traditional supervised learning problem by synthesizing pseudo-sample features for the unseen classes, thereby effectively overcoming the low accuracy of traditional zero-sample target recognition methods.

Embodiment Construction

[0027] The present invention will be described in further detail below with reference to the accompanying drawings and specific embodiments.

[0028] As shown in the flow chart of Figure 1, the specific steps of the method of the invention are as follows:

[0029] A zero-sample target classification method based on pseudo-sample feature synthesis, including the following steps:

[0030] Step 1: Designate the category to be identified as the unseen class. First obtain the samples and annotation information of other categories similar to the category to be identified, and designate these as the visible class dataset;
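
A rough illustration of this data organisation is given below; the class names and file names are purely hypothetical, and only show that unseen classes start with no training samples while visible classes carry images and labels.

```python
# Illustrative data organisation only: class and file names are hypothetical.
# Unseen classes have no training images; visible (seen) classes do.
unseen_classes = ["zebra", "otter"]
visible_dataset = {
    "horse":  {"images": ["horse_001.jpg", "horse_002.jpg"], "label": "horse"},
    "donkey": {"images": ["donkey_001.jpg"], "label": "donkey"},
    "beaver": {"images": ["beaver_001.jpg"], "label": "beaver"},
}
```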

[0031] Step 2: Obtain the semantic descriptions of all categories, both visible classes and unseen classes, through web crawling and similar means, and convert the class description information into semantic vectors with a natural language processing model;
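
The excerpt does not name the natural language processing model used, so the following is only a minimal sketch of one common approach: averaging pretrained word embeddings over a crawled class description. The `word_vectors` lookup table is a hypothetical placeholder for whatever embedding model is actually used.

```python
# Minimal sketch of Step 2 (assumed approach, not the patent's exact model):
# turn a crawled class description into a semantic vector by averaging word
# embeddings. `word_vectors` stands in for a pretrained embedding table.
import numpy as np

def description_to_semantic_vector(description, word_vectors, dim=300):
    """Average the word embeddings of a class description."""
    tokens = description.lower().split()
    vecs = [word_vectors[t] for t in tokens if t in word_vectors]
    return np.mean(vecs, axis=0) if vecs else np.zeros(dim)

# Toy usage with a made-up two-word vocabulary:
word_vectors = {"striped": np.random.rand(300), "horse": np.random.rand(300)}
zebra_vec = description_to_semantic_vector("a striped horse", word_vectors)
```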

[0032] Step 3: Calculate a similarity score between each unseen class and each visible class according to their semantic vectors;
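
The excerpt does not state which similarity measure is applied to the semantic vectors; the sketch below assumes cosine similarity for illustration.

```python
# Minimal sketch of Step 3, assuming cosine similarity between semantic vectors.
import numpy as np

def similarity_scores(unseen_vec, visible_vecs):
    """Cosine similarity between one unseen-class vector and every visible-class vector."""
    u = unseen_vec / (np.linalg.norm(unseen_vec) + 1e-12)
    v = visible_vecs / (np.linalg.norm(visible_vecs, axis=1, keepdims=True) + 1e-12)
    return v @ u  # shape: (num_visible_classes,)

# Example: one unseen class against three visible classes (random stand-in vectors).
scores = similarity_scores(np.random.rand(300), np.random.rand(3, 300))
```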

[0033] Step 4: Build a convolutional neural network classification model. The model is divided into two parts: a feature extraction part and a classification part;
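
A minimal PyTorch sketch of such a two-part model is shown below. The layer sizes and architecture are illustrative assumptions, since the excerpt does not specify them; the point is only the split into a feature-extraction part and a classification part.

```python
# Sketch of Step 4: a CNN split into feature-extraction and classification parts.
# Architecture details are assumptions, not taken from the patent.
import torch
import torch.nn as nn

class ZeroShotCNN(nn.Module):
    def __init__(self, num_classes, feat_dim=256):
        super().__init__()
        # Feature extraction part: image -> feature vector.
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, feat_dim), nn.ReLU(),
        )
        # Classification part: feature vector -> class scores.
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x))

model = ZeroShotCNN(num_classes=10)
feats = model.features(torch.randn(4, 3, 64, 64))   # feature vectors only
logits = model.classifier(feats)                     # class scores
```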



Abstract

The invention provides a zero-sample target classification method based on pseudo-sample feature synthesis. The method comprises the following steps: obtaining the samples and annotation information of other categories similar to the category to be identified; obtaining semantic descriptions of the visible and unseen classes by means such as web crawling, and converting the class description information into semantic vectors through a natural language processing model; calculating a similarity score between each unseen class and each visible class; constructing a convolutional neural network classification model divided into a feature extraction part and a classification part; for each unseen class, screening the N visible classes with the highest similarity scores, randomly selecting samples from them, and inputting the samples into the feature extraction network to obtain feature vectors; combining the feature vectors of the N visible classes according to the similarity scores to serve as feature vectors of the unseen class; and training the classification network with these unseen-class feature vectors, so that samples of a category can be accurately identified even when no training samples of that category of targets to be identified are available.
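
To make the synthesis step concrete, the sketch below assumes a similarity-weighted average of randomly drawn visible-class feature vectors; the exact combination rule, sample counts, and variable names (`visible_features`, `n_top`, `n_pseudo`) are assumptions for illustration, not taken from the patent text.

```python
# Hedged sketch of pseudo-feature synthesis for one unseen class: take the N
# most similar visible classes, draw one random sample feature from each, and
# combine them weighted by (assumed non-negative) similarity scores.
import numpy as np

def synthesize_pseudo_features(scores, visible_features, n_top=3, n_pseudo=100, rng=None):
    """scores: (num_visible,) similarities; visible_features: per-class arrays
    of shape (num_samples, feat_dim), indexed by visible-class index."""
    rng = rng or np.random.default_rng()
    top = np.argsort(scores)[-n_top:]              # N most similar visible classes
    weights = scores[top] / scores[top].sum()      # normalized similarity weights
    pseudo = []
    for _ in range(n_pseudo):
        # One random sample feature from each of the N visible classes,
        # combined according to the similarity weights.
        picks = [visible_features[c][rng.integers(len(visible_features[c]))] for c in top]
        pseudo.append(np.average(picks, axis=0, weights=weights))
    return np.stack(pseudo)  # pseudo sample features for the unseen class
```

These synthesized features would then play the role of training samples for the unseen class when fitting the classification network.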

Description

Technical field
[0001] The invention relates to a zero-sample target classification method in mark recognition, in particular to a zero-sample target classification method based on pseudo-sample feature synthesis.
Background technique
[0002] Supervised learning classification methods have achieved great success in all walks of life. Supervised learning learns a classification function from a large amount of labeled training data. However, for a specific category, collecting and annotating a large amount of data is time-consuming, expensive, and inefficient, and in some fields even collecting a small amount of data is very difficult. Therefore, research on zero-sample target recognition methods, which address applications where the targets to be recognized have very few or even no available training samples, has important application value.
[0003] Different from the traditional supervised learning method, the purpose of the zero-sample target recognition metho...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/62
CPC: G06F18/241
Inventors: 叶秀芬, 李传龙, 刘文智, 李海波, 韩亚潼
Owner: HARBIN ENG UNIV