
Three-dimensional style transfer model based on double-path stylized network

A stereoscopic style transfer model based on a dual-path stylization network, applied in the field of image processing, which can solve problems such as introduced artifacts and a large amount of calculation.

Publication Status: Inactive; Publication Date: 2018-10-19
SHENZHEN WEITESHI TECH

AI Technical Summary

Problems solved by technology

[0004] Aiming at the problems of introduced artifacts and a large amount of calculation, the purpose of the present invention is to provide a stereoscopic style transfer model based on a dual-path stylization network. The stereoscopic style transfer model is composed of a dual-path stylization network and a loss network. The dual-path stylization network takes a stereo pair as input and processes each view in a single path; a feature fusion block is embedded into the stylization network to effectively share feature-level information between the two paths; the loss network computes a perceptual loss and a multi-layer view loss to coordinate the training of the two paths of the stylization network and to generate view-consistent stylized results.
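To make the dual-path structure and the feature fusion block concrete, the following is a minimal PyTorch sketch. The module names, layer sizes, and the gated-fusion design are illustrative assumptions; the patent text does not specify these implementation details.

```python
# Minimal sketch of a dual-path stylization network with a feature fusion block.
# Channel sizes and the gated fusion are assumptions, not values from the patent.
import torch
import torch.nn as nn


class FeatureFusionBlock(nn.Module):
    """Blends feature maps from the two view paths so each path sees the other."""

    def __init__(self, channels):
        super().__init__()
        # A 1x1 convolution over the concatenated features produces a per-pixel
        # gate deciding how much of the other view's features to mix in.
        self.gate = nn.Sequential(
            nn.Conv2d(2 * channels, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, own, other):
        g = self.gate(torch.cat([own, other], dim=1))
        return own + g * other


class DualPathStylizer(nn.Module):
    """Two weight-shared encoder/decoder paths joined by a feature fusion block."""

    def __init__(self, channels=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, channels, 9, padding=4), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.fuse = FeatureFusionBlock(channels)
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(channels, channels, 3, stride=2,
                               padding=1, output_padding=1), nn.ReLU(),
            nn.Conv2d(channels, 3, 9, padding=4),
        )

    def forward(self, left, right):
        # Each view is processed in its own path with shared weights.
        f_l, f_r = self.encoder(left), self.encoder(right)
        # Each path receives feature-level information from the other path.
        out_l = self.decoder(self.fuse(f_l, f_r))
        out_r = self.decoder(self.fuse(f_r, f_l))
        return out_l, out_r
```

In this sketch the two paths share weights and the fusion happens once at the bottleneck; the patent only requires that a fusion block share feature-level information between the paths, so where and how often it is applied is a design choice.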


Embodiment Construction

[0031] It should be noted that, in the case of no conflict, the embodiments in the present application and the features in the embodiments can be combined with each other. The present invention will be further described in detail below in conjunction with the drawings and specific embodiments.

[0032] Figure 1 is a system framework diagram of the three-dimensional style transfer model based on a dual-path stylization network in the present invention. The model mainly includes a dual-path stylization network and a loss network. The dual-path stylization network takes a stereo pair and processes each view in a single path; a feature fusion block is embedded into the stylization network to effectively share feature-level information between the two paths; the loss network computes a perceptual loss and a multi-layer view loss to coordinate the training of the two paths of the stylization network and to generate view-consistent stylized results.
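As an illustration of the loss network mentioned in paragraph [0032], the following is a minimal PyTorch sketch. The VGG-16 backbone, the particular feature layers, and the use of a precomputed disparity-warped right view with an occlusion mask in the view loss are assumptions made for the example; the patent text only states that the loss network computes a perceptual loss and a multi-layer view loss.

```python
# Minimal sketch of a loss network: a fixed VGG-16 supplies features for the
# perceptual loss, and a multi-layer view loss penalizes disagreement between
# the two stylized views in regions visible from both views.
import torch
import torch.nn.functional as F
from torchvision.models import vgg16


class LossNetwork(torch.nn.Module):
    def __init__(self, content_layers=(8,), style_layers=(3, 8, 15, 22)):
        super().__init__()
        self.vgg = vgg16(weights="IMAGENET1K_V1").features.eval()
        for p in self.vgg.parameters():
            p.requires_grad_(False)
        self.content_layers, self.style_layers = content_layers, style_layers
        self.layers = sorted(set(content_layers) | set(style_layers))

    def features(self, x):
        feats, out = {}, x
        for i, layer in enumerate(self.vgg):
            out = layer(out)
            if i in self.layers:
                feats[i] = out
            if i >= max(self.layers):
                break
        return feats

    @staticmethod
    def gram(f):
        b, c, h, w = f.shape
        f = f.reshape(b, c, h * w)
        return f @ f.transpose(1, 2) / (c * h * w)

    def perceptual_loss(self, stylized, content, style):
        fs, fc, fst = self.features(stylized), self.features(content), self.features(style)
        loss = sum(F.mse_loss(fs[i], fc[i]) for i in self.content_layers)
        loss += sum(F.mse_loss(self.gram(fs[i]), self.gram(fst[i])) for i in self.style_layers)
        return loss

    def view_loss(self, stylized_left, stylized_right_warped, occlusion_mask):
        # Multi-layer view loss: compare the two views in feature space at several
        # layers, restricted to pixels visible in both views (occlusion_mask is
        # a (B,1,H,W) tensor of 0/1 values; the warping is computed elsewhere).
        fl = self.features(stylized_left)
        fr = self.features(stylized_right_warped)
        loss = 0.0
        for i in self.layers:
            m = F.interpolate(occlusion_mask, size=fl[i].shape[-2:], mode="nearest")
            loss = loss + F.mse_loss(m * fl[i], m * fr[i])
        return loss
```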

[0033] Figure 2 is a dual-path ...



Abstract

The invention proposes a three-dimensional style transfer model based on a double-path stylized network; the main contents of the method comprise the double-path stylized network and a loss network. The three-dimensional style transfer model consists of two parts: the double-path stylized network and the loss network. The double-path stylized network takes a stereo pair, and each view is processed in a single path; a feature fusion block is embedded into the stylized network, thereby achieving effective sharing of feature-level information between the two paths; the loss network calculates the perceptual loss and the multi-layer view loss so as to coordinate the training of the two paths of the stylized network and generate stylization results with consistent views. The invention proposes a novel feature fusion block for transmitting information from one path to the other, and the method can generate stylization results with better view consistency without degrading image quality.
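Under the assumptions of the two sketches above, a single training step that coordinates the two paths with the perceptual loss and the multi-layer view loss might look as follows. The loss weights and the helper right_warped_to_left_fn (which warps the stylized right view into the left view using an externally computed disparity) are hypothetical, not specified by the patent.

```python
# Illustrative training step combining the DualPathStylizer and LossNetwork
# sketches; weighting factors and the warping helper are assumptions.
def training_step(stylizer, loss_net, optimizer, left, right, style_img,
                  right_warped_to_left_fn, occlusion_mask,
                  w_perceptual=1.0, w_view=10.0):
    out_l, out_r = stylizer(left, right)
    # Perceptual loss keeps each stylized view faithful to its content and the style.
    l_percep = (loss_net.perceptual_loss(out_l, left, style_img)
                + loss_net.perceptual_loss(out_r, right, style_img))
    # View loss couples the two paths so the stylized pair stays view-consistent.
    l_view = loss_net.view_loss(out_l, right_warped_to_left_fn(out_r), occlusion_mask)
    loss = w_perceptual * l_percep + w_view * l_view
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```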

Description

Technical field
[0001] The invention relates to the field of image processing, in particular to a stereoscopic style transfer model based on a dual-path stylization network.
Background technique
[0002] With the advancement of technology, more and more novel devices provide people with various visual experiences. Among them, devices that provide an immersive visual experience are among the most popular, including virtual reality devices, augmented reality devices, 3D movie systems, and 3D TVs. One common component that these devices share is stereo imaging technology, which creates the illusion of depth (stereopsis) through binocular vision. In order to provide a more attractive visual experience, much research applies striking visual effects to stereoscopic images. Neural style transfer is one of the emerging techniques that can be used to achieve this goal. However, although the existing methods meet the requirement of keeping the views consistent...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T17/00; G06T19/20; G06K9/62
CPC: G06T17/00; G06T19/20; G06T2207/10012; G06T2207/20081; G06T2207/20084; G06T2207/20228; G06T2219/2024; G06F18/253
Inventor: 夏春秋 (Xia Chunqiu)
Owner: SHENZHEN WEITESHI TECH