
Image description method of self-attention mechanism based on sample adaptive semantic guidance

A sample-adaptive image description technology, applied in semantic analysis, neural learning methods, natural language data processing, etc. It addresses the problems of reduced generalization caused by fixed parameters and excessive semantic noise, and achieves good performance, reduced precision loss, and strong transferability.

Pending Publication Date: 2021-12-24
XIAMEN UNIV
0 Cites · 3 Cited by

AI Technical Summary

Problems solved by technology

[0016] The purpose of the present invention is to provide an image description method with a self-attention mechanism based on sample-adaptive semantic guidance, addressing two problems: the generalization of traditional Transformer-based image description methods decreases because the parameters are fixed in the test phase, and current models that use semantic information suffer from excessive semantic noise.




Embodiment Construction

[0071] The following embodiments will describe the present invention in detail in conjunction with the accompanying drawings.

[0072] Embodiments of the present invention include the following steps:

[0073] 1) For the images in the image library, first use a convolutional neural network to extract the corresponding image features A;

[0074] 2) For the images in the image library, use a semantic concept extractor to extract the semantic concepts C;

[0075] 3) Feed the image features A and the semantic concepts C into separate self-attention networks to further encode them, obtaining the corresponding hidden image features and hidden semantic concept features;

[0076] 4) Feed the aforementioned hidden semantic concept features into the parameter generation network to generate the self-attention network parameters W_DQ and W_DK;

[0077] 5) Input the aforementioned hidden image features into the generated self-attention network to obtain the semantically guided image features O;

[0078] 6) In...
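Steps 3)–5) above can be sketched as a single-head toy example. This is an illustrative sketch, not the patented implementation: all dimensions, the mean-pooling of concept features, and the generator matrices `G_q`/`G_k` are assumptions; only the parameter names `W_DQ` and `W_DK` come from the text.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8           # feature dimension (illustrative)
n_regions = 5   # image regions from the detector
n_concepts = 4  # extracted semantic concepts

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    A = softmax(Q @ K.T / np.sqrt(K.shape[-1]))
    return A @ V

# Step 3: encode image features A and concept features C into hidden features
A = rng.normal(size=(n_regions, d))
C = rng.normal(size=(n_concepts, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) * 0.1 for _ in range(3))
H_img = self_attention(A, Wq, Wk, Wv)
H_sem = self_attention(C, Wq, Wk, Wv)

# Step 4: a parameter-generation network maps the pooled semantic features
# to per-sample projection matrices W_DQ, W_DK (generator weights hypothetical)
h = H_sem.mean(axis=0)                   # pool concepts -> (d,)
G_q = rng.normal(size=(d, d * d)) * 0.1
G_k = rng.normal(size=(d, d * d)) * 0.1
W_DQ = (h @ G_q).reshape(d, d)
W_DK = (h @ G_k).reshape(d, d)

# Step 5: semantically guided self-attention over the image features,
# using the sample-adaptive query/key projections
O = self_attention(H_img, W_DQ, W_DK, Wv)
print(O.shape)  # → (5, 8)
```

Because `W_DQ` and `W_DK` are regenerated from each sample's semantic concepts, the attention weights adapt per image even though the generator's own weights stay fixed at test time.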



Abstract

The invention discloses an image description method of a self-attention mechanism based on sample-adaptive semantic guidance, and belongs to the technical field of artificial intelligence. To overcome the defect that, in traditional methods adopting a self-attention mechanism, the parameters are fixed for every sample, the method comprises the following steps: 1) extracting features corresponding to a plurality of candidate regions of a to-be-described image with an object detector; 2) extracting a plurality of semantic concepts from the to-be-described image with a semantic concept detector; 3) performing feature enhancement on the features extracted in steps 1) and 2) through different self-attention networks; 4) generating the parameters of a self-attention network from the semantic concept features enhanced in step 3) using a parameter generation network; 5) inputting the visual features enhanced in step 3) into the generated self-attention network, achieving better visual representation through the semantically generated self-attention network; and 6) inputting the visual features output in step 5) into a decoder, generating a description sentence for the image, and defining a loss function.
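Step 6) trains the decoder against ground-truth captions with a loss function; the abstract does not specify which, so the per-word cross-entropy below is an assumed, conventional choice for captioning (the function name and the toy numbers are hypothetical):

```python
import numpy as np

def caption_xent_loss(logits, target_ids):
    """Average cross-entropy over one ground-truth caption.
    logits: (T, V) decoder scores per time step; target_ids: (T,) word indices."""
    z = logits - logits.max(axis=1, keepdims=True)          # stabilized scores
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(target_ids)), target_ids].mean()

# Toy example: 3 decoding steps, vocabulary of 5 words; the decoder
# strongly favors the correct word at each step, so the loss is small.
logits = np.array([[4., 0., 0., 0., 0.],
                   [0., 4., 0., 0., 0.],
                   [0., 0., 4., 0., 0.]])
loss = caption_xent_loss(logits, np.array([0, 1, 2]))
print(loss)
```

In training, the decoder would be unrolled over the caption with the semantically guided features O from step 5) as its visual input, and this loss back-propagated through the whole pipeline, including the parameter generation network.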

Description

Technical field

[0001] The invention relates to automatic image description in the field of artificial intelligence, and in particular to an image description method based on a self-attention mechanism guided by sample-adaptive semantics, which describes the objective content of images in natural language.

Background technique

[0002] Automatic image description (image captioning) is an ultimate machine intelligence task proposed by the artificial intelligence community in recent years. Its goal is, for a given image, to describe the image's objective content in natural language. With the development of computer vision technology, tasks such as object detection, recognition, and segmentation can no longer meet people's production needs, and there is an urgent need to describe image content automatically and objectively. Different from tasks such as object detection and semantic segmentation, automatic image description requires an overall and o...

Claims


Application Information

IPC(8): G06K9/62; G06F40/284; G06F40/30; G06N3/04; G06N3/08
CPC: G06F40/284; G06F40/30; G06N3/04; G06N3/08; G06F18/24; Y02T10/40
Inventor: 纪荣嵘, 纪家沂, 李毅男
Owner XIAMEN UNIV