
Method for generating images from text with perceptual joint spatial attention

A text-to-image generation technology based on attention, applied in image data processing, neural learning methods, 2D image generation, etc. It addresses the problems that existing methods cannot focus on and refine all objects well and that the quality of the generated results is inaccurate, achieving the effects of improved perceptual quality and layout and reduced variance.

Pending Publication Date: 2022-04-22
HUNAN UNIV

AI Technical Summary

Problems solved by technology

For example, when the object category is not described in the text, the content of the generated image may differ greatly from the real image.
Furthermore, although multi-stage methods are by far the best generative methods, when dealing with complex text containing many objects (such as the COCO dataset), they cannot focus on and refine all objects well, so the quality of the generated results suffers.



Embodiment Construction

[0085] The present invention proposes a perceptual joint spatial attention text-to-image generation method, based on a multi-stage generative adversarial network, which aims to improve the perceptual quality and layout of text-generated images. The idea of the method rests on a dual-attention mechanism. Specifically, the method combines a word-level spatial attention mechanism with a dynamic memory mechanism, and the two respond jointly to ensure that the generator focuses on the content, position, and shape of the image sub-regions corresponding to the most relevant words. In addition, the method introduces a perceptual loss function for the last generator of the multi-stage text-to-image model, with the purpose of reducing the difference between the final generated image and the target image, so that the generated image is more semantically similar to the target image.
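The word-level spatial attention step described above can be sketched in a few lines; the following is a minimal NumPy illustration, not the patent's actual network. The sizes (5 words, 16 sub-regions, 32-dimensional features) are hypothetical, chosen only for concreteness:

```python
import numpy as np

def word_spatial_attention(word_emb, img_feat):
    """Word-level spatial attention: each image sub-region attends over
    all words, so a generator can refine the content, position, and
    shape of regions from the most relevant words.

    word_emb : (T, D) embeddings for T words
    img_feat : (N, D) features for N spatial sub-regions
    returns  : (N, D) word-context vector for each sub-region
    """
    scores = img_feat @ word_emb.T               # (N, T) region-word similarity
    scores -= scores.max(axis=1, keepdims=True)  # stabilize the softmax
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)      # softmax over the words
    return attn @ word_emb                       # weighted word context per region

# hypothetical toy inputs
rng = np.random.default_rng(0)
ctx = word_spatial_attention(rng.normal(size=(5, 32)),
                             rng.normal(size=(16, 32)))
print(ctx.shape)  # (16, 32)
```

In the patent's setting this context would then be fused with the image features (together with the dynamic memory response) before the next refinement stage; that fusion is omitted here.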

[0086] In order to achieve the above goals, the following solutions are ad...



Abstract

The invention discloses a method for generating images from text by combining perceptual and spatial attention. The method comprises the following steps: generating an initial image that draws the basic shape and color of the objects; refining the image to produce new image features refined along the spatial dimension and the word-importance dimension, where the refining step fuses fine-grained word-level text information with image information, removes defects from and adds details to the initial image by combining a spatial attention mechanism with a dynamic memory mechanism, and enhances the regional characterization of image features; and training an objective function that encourages the generator to produce images that are more realistic and better match the text semantics. The method ensures that the generator focuses on the content, position, and shape of the image sub-regions corresponding to the most relevant words, avoids randomness in the generation process, and reduces the difference between the final generated image and the target image. It can thus improve the perceptual quality and layout of text-generated images, raise generation efficiency, and generate images efficiently and accurately.
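The perceptual term that pulls the final generated image toward the target image is, in general form, a feature-space distance. A hedged sketch follows; the abstract does not specify which feature extractor or layer is used, so the features here are simply assumed to come from some fixed pretrained network:

```python
import numpy as np

def perceptual_loss(feat_gen, feat_real):
    """Mean squared distance between feature maps of the generated and
    target images. With features from a fixed pretrained extractor,
    minimizing this pulls the generated image toward the target in a
    semantic feature space rather than in raw pixel space."""
    return float(np.mean((feat_gen - feat_real) ** 2))

# identical feature maps give zero loss; diverging ones increase it
f = np.ones((8, 8, 64))
print(perceptual_loss(f, f))            # 0.0
print(perceptual_loss(f, f * 0.5) > 0)  # True
```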

Description

Technical field

[0001] The invention belongs to the field of text-to-image synthesis, and in particular relates to a method for generating images from text with combined perceptual and spatial attention.

Background

[0002] Text-to-image synthesis techniques hold great promise for applications in areas such as art generation and computer-aided design. Generating images from text can not only greatly reduce the cost of finding matching images for text creators, but also improve the efficiency of computer-assisted creation. It is therefore highly desirable to find more efficient ways to generate realistic high-resolution images for text-to-image synthesis.

[0003] A Generative Adversarial Network (GAN) is based on ideas from game theory and constructs a generator model and a discriminator model with deep neural networks. The generator produces samples from random noise as input, and the discriminator judges whether the generated samples are real. In the ...
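The generator/discriminator game described in [0003] is trained with a pair of adversarial losses. The following is a minimal sketch of the standard non-saturating form on hypothetical discriminator outputs, not the multi-stage objective of this patent:

```python
import numpy as np

def gan_losses(d_real, d_fake):
    """Standard (non-saturating) GAN losses.

    d_real : discriminator probabilities on real samples
    d_fake : discriminator probabilities on generated samples
    The discriminator is rewarded for scoring real samples high and
    fake ones low; the generator is rewarded for fooling it.
    """
    eps = 1e-8  # guard against log(0)
    d_loss = -np.mean(np.log(d_real + eps) + np.log(1.0 - d_fake + eps))
    g_loss = -np.mean(np.log(d_fake + eps))  # generator pushes D(G(z)) -> 1
    return d_loss, g_loss

# hypothetical discriminator outputs: both losses are positive
# whenever D and G are imperfect
d_loss, g_loss = gan_losses(np.array([0.9, 0.8]), np.array([0.1, 0.2]))
```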


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T11/00; G06F40/284; G06V10/80; G06N3/04; G06N3/08; G06V10/82
CPC: G06T11/001; G06F40/284; G06N3/08; G06N3/045; G06F18/253
Inventors: 赵欢, 赵玉青, 李婷婷, 陈恩思, 李博
Owner: HUNAN UNIV