An image description generation method and device based on a deep residual network and attention

An image description generation and attention technology, applied in the field of image processing, which solves the problem of accuracy decline in deep neural networks and achieves the effects of alleviating gradient vanishing, resolving model degradation, and avoiding accuracy saturation and decline.

Active Publication Date: 2019-06-28
QILU UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0006] In order to overcome the deficiencies of the above-mentioned existing technologies, the present disclosure provides a method and device for generating image descriptions based on a deep residual network and attention, which solves the problem of decreased accuracy in deep neural networks: a deep residual network is used to learn image features from the bottom layer to the top layer and generate a rich representation of the input image, and this representation is then combined with an attention-based recurrent long short-term memory network to generate natural, fluent description sentences.
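As a concrete illustration of the encoder side, the minimal sketch below truncates a pretrained deep residual network before its pooling and classification layers so that the final convolutional feature map serves as the bottom-to-top image representation fed to the attention decoder. This is an assumed PyTorch/torchvision setup (ResNet-101, a 14x14 region grid) for illustration only, not the patent's reference implementation.

```python
# Minimal sketch (assumption: PyTorch + torchvision >= 0.13; not the patent's code).
# A pretrained deep residual network is cut before its pooling and classification
# layers, so the last convolutional feature map becomes the image representation.
import torch
import torch.nn as nn
import torchvision.models as models

class ResidualEncoder(nn.Module):
    def __init__(self, feature_size=14):
        super().__init__()
        resnet = models.resnet101(weights=models.ResNet101_Weights.DEFAULT)
        # Drop the average-pooling and fully connected layers; keep the conv stages.
        self.backbone = nn.Sequential(*list(resnet.children())[:-2])
        # Resize the spatial grid so every image yields a fixed number of regions.
        self.pool = nn.AdaptiveAvgPool2d((feature_size, feature_size))

    def forward(self, images):                      # images: (B, 3, H, W)
        feats = self.backbone(images)               # (B, 2048, h, w)
        feats = self.pool(feats)                    # (B, 2048, 14, 14)
        # Flatten spatial positions into a set of region vectors for attention.
        return feats.flatten(2).permute(0, 2, 1)    # (B, 196, 2048)

encoder = ResidualEncoder()
regions = encoder(torch.randn(2, 3, 224, 224))      # torch.Size([2, 196, 2048])
```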


Embodiment Construction

[0060] The present disclosure will be further described below in conjunction with the accompanying drawings and embodiments.

[0061] It should be noted that the following detailed description is exemplary and intended to provide further explanation of the present disclosure. Unless defined otherwise, all technical and scientific terms used in this disclosure have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs.

[0062] It should be noted that the terminology used here is only for describing specific implementations and is not intended to limit the exemplary implementations according to the present application. As used herein, unless the context clearly indicates otherwise, the singular forms are intended to include the plural forms as well. It should also be understood that when the terms "comprising" and/or "including" are used in this specification, they indicate the presence of the stated features, steps, operations, devices, components and/or combinations thereof.


Abstract

The invention discloses an image description generation method and device based on a deep residual network and attention. The method solves the problem of reduced accuracy in deep neural networks: a deep residual network is used to learn image features from the bottom layer to the top layer and generate rich representations of the input image, and natural, fluent description sentences are then generated in combination with an attention-based recurrent long short-term memory (LSTM) network. The method comprises the following steps: acquiring a large amount of image sample data and preprocessing it; extracting image features from the preprocessed image sample data; processing the extracted image features with a residual neural network model to generate an image representation; mapping the image representation to the input of an attention-based recurrent LSTM language model; and using the attention-based recurrent LSTM language model to predict word vectors of the image and generate a descriptive sentence for the image.
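To make the decoding side of the abstract concrete, the sketch below shows one step of an attention-based recurrent LSTM language model: soft (additive) attention over the region vectors yields a context vector, which is concatenated with the previous word's embedding and fed to an LSTM cell whose hidden state is projected to a vocabulary distribution. The dimensions, the additive attention form, and all layer names are assumptions made for illustration, not the patent's exact design.

```python
# Sketch of one attention + LSTM decoding step (assumptions: PyTorch, soft
# additive attention, 2048-d region vectors from the residual encoder).
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionDecoderStep(nn.Module):
    def __init__(self, vocab_size, feat_dim=2048, embed_dim=512,
                 hidden_dim=512, attn_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Additive attention: score each region against the current hidden state.
        self.feat_proj = nn.Linear(feat_dim, attn_dim)
        self.hidden_proj = nn.Linear(hidden_dim, attn_dim)
        self.score = nn.Linear(attn_dim, 1)
        # The LSTM cell consumes [previous word embedding ; attended context].
        self.lstm = nn.LSTMCell(embed_dim + feat_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, prev_word, regions, h, c):
        # regions: (B, R, feat_dim); prev_word: (B,) word indices
        scores = self.score(torch.tanh(
            self.feat_proj(regions) + self.hidden_proj(h).unsqueeze(1)))  # (B, R, 1)
        alpha = F.softmax(scores, dim=1)                 # attention weights over regions
        context = (alpha * regions).sum(dim=1)           # (B, feat_dim)
        x = torch.cat([self.embed(prev_word), context], dim=1)
        h, c = self.lstm(x, (h, c))
        logits = self.out(h)                             # scores for the next word
        return logits, h, c, alpha

dec = AttentionDecoderStep(vocab_size=10000)
h = c = torch.zeros(2, 512)
logits, h, c, alpha = dec(torch.tensor([1, 1]), torch.randn(2, 196, 2048), h, c)
```

At inference time this step is applied repeatedly, feeding the predicted word (greedy or beam-searched) back in as the next input until an end-of-sentence token is produced.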

Description

technical field

[0001] The present disclosure relates to the field of image processing, and in particular to an image description generation method and device based on a deep residual network and attention.

Background technique

[0002] Image description generation technology is closely related to technologies such as image semantic analysis, image annotation and high-level image semantic extraction. In recent years, deep learning has shown promising performance on both image processing and natural language processing tasks.

[0003] In recent years, deep convolutional networks have achieved a series of breakthroughs in image classification and image recognition. Deep networks produce richer features by stacking more layers, and many important visual recognition tasks benefit from deep models. However, as network depth increases, accuracy begins to saturate and then declines rapidly; this is the model degradation problem. During the research and dev...
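The degradation problem described above is what residual learning targets: instead of fitting a desired mapping H(x) directly, a block of layers fits the residual F(x) = H(x) - x and adds the input back through an identity shortcut, so additional layers can at worst approximate the identity mapping. A minimal, standard ResNet-style block is sketched below (PyTorch assumed; this is illustrative and not code from the patent).

```python
# Minimal residual block sketch: two 3x3 convolutions plus an identity
# shortcut, so the block learns F(x) and outputs F(x) + x.
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        identity = x                      # the shortcut carries the input unchanged
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = out + identity              # residual addition eases gradient flow
        return self.relu(out)

block = ResidualBlock(64)
y = block(torch.randn(1, 64, 56, 56))     # output has the same shape as the input
```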


Application Information

IPC(8): G06K9/62; G06N3/04; G06N3/08
Inventor: 杨振宇, 张姣
Owner: QILU UNIV OF TECH