Dynamic texture synthesis method based on double-identification flow convolutional network

A convolutional network technology for dynamic textures, applied in the field of dynamic texture synthesis based on a dual recognition stream convolutional network, addressing the problems that physical models are difficult to construct, computationally complex, and applicable only to specific phenomena.

Publication Date: 2017-10-20 (Inactive)
SHENZHEN WEITESHI TECH

Problems solved by technology

Due to the complexity of dynamic natural scenes, these physical models are difficult to construct, computationally complex, and applicable only to specific phenomena.

Detailed Description of the Embodiments

[0033] It should be noted that, provided there is no conflict, the embodiments of the present application and the features within those embodiments may be combined with one another. The present invention is described in further detail below in conjunction with the drawings and specific embodiments.

[0034] Figure 1 is a system framework diagram of the dynamic texture synthesis method based on a dual recognition stream convolutional network according to the present invention. The method mainly includes: the design of the appearance stream in the texture model, the design of the dynamic stream in the texture model, the generation of new textures, the synthesis of dynamic textures, and texture style transfer.
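
The patent text gives no implementation, so the framework above is illustrated here only as a minimal, hypothetical PyTorch sketch: appearance statistics are matched frame by frame and dynamic statistics are matched on consecutive frame pairs, each summarized by Gram matrices of ConvNet feature maps. The names two_stream_loss, appearance_features, and dynamics_features are illustrative placeholders rather than terms from the patent; the two feature extractors stand in for the pre-trained object recognition and optical flow networks mentioned in the abstract.

```python
# A minimal sketch of the combined two-stream objective (assumed formulation).
from typing import Callable, List
import torch

# Each extractor maps an image (or a stacked frame pair) to a list of feature maps.
FeatureFn = Callable[[torch.Tensor], List[torch.Tensor]]

def gram(feat: torch.Tensor) -> torch.Tensor:
    """Channel-wise correlations of a feature map, normalized by its size."""
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def two_stream_loss(frames: torch.Tensor,
                    app_targets: List[List[torch.Tensor]],
                    dyn_targets: List[List[torch.Tensor]],
                    appearance_features: FeatureFn,
                    dynamics_features: FeatureFn) -> torch.Tensor:
    """Squared Gram-matrix differences: appearance per frame, dynamics per frame pair."""
    loss = frames.new_zeros(())
    T = frames.shape[0]                       # frames: (T, 3, H, W)
    for t in range(T):                        # appearance stream, one frame at a time
        for f, g in zip(appearance_features(frames[t:t + 1]), app_targets[t]):
            loss = loss + torch.mean((gram(f) - g) ** 2)
    for t in range(T - 1):                    # dynamic stream, consecutive frame pairs
        pair = torch.cat([frames[t:t + 1], frames[t + 1:t + 2]], dim=1)
        for f, g in zip(dynamics_features(pair), dyn_targets[t]):
            loss = loss + torch.mean((gram(f) - g) ** 2)
    return loss
```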

[0035] In the design of the appearance stream in the texture model, the appearance stream of the dual recognition streams is based on a spatial texture model, and the texture appearance is captured by the feature correlations at different levels of the convolutional network in the tar...
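
As an illustration of such a spatial texture model, the sketch below computes Gram matrices (channel-wise feature correlations) of a single frame at several depths of a pre-trained recognition network. The patent does not name the network or the layers; torchvision's VGG-19 and the layer indices below are assumptions chosen purely for the example.

```python
# A minimal sketch of a per-frame appearance descriptor, assuming a PyTorch setup.
import torch
import torchvision.models as models

# Assumed layer choice: relu1_1 .. relu5_1 in vgg19.features.
APPEARANCE_LAYERS = [1, 6, 11, 20, 29]

def gram_matrix(feat: torch.Tensor) -> torch.Tensor:
    """Channel-wise feature correlations of one frame's feature map."""
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def appearance_descriptor(frame: torch.Tensor, cnn: torch.nn.Module) -> list:
    """Gram matrices of the feature maps at several depths of the recognition network."""
    grams, x = [], frame
    for i, layer in enumerate(cnn):
        x = layer(x)
        if i in APPEARANCE_LAYERS:
            grams.append(gram_matrix(x))
        if i == APPEARANCE_LAYERS[-1]:
            break
    return grams

if __name__ == "__main__":
    # On newer torchvision, pass weights=... instead of pretrained=True.
    cnn = models.vgg19(pretrained=True).features.eval()
    for p in cnn.parameters():
        p.requires_grad_(False)
    frame = torch.rand(1, 3, 256, 256)            # one frame of the input texture
    targets = appearance_descriptor(frame, cnn)   # spatial texture statistics for this frame
    print([tuple(g.shape) for g in targets])
```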

Abstract

The invention provides a dynamic texture synthesis method based on a dual recognition stream convolutional network. The method mainly comprises: design of the appearance stream in the texture model, design of the dynamic stream in the texture model, generation of new textures, dynamic texture synthesis, and texture style transfer. Target recognition and optical flow prediction are performed with pre-trained convolutional networks (ConvNets). Given a dynamic texture as input, the appearance of the input texture is captured from the filter responses of the object recognition ConvNet, and the dynamics of the input texture are modeled from the filter responses of the optical flow ConvNet. To generate a new texture, a noise input sequence is optimized so that it matches the features of each recognition stream of the input texture; by combining the appearance of one texture with the dynamics of another, an entirely new dynamic texture can be generated. The method can produce high-quality samples that match the input texture in both per-pixel appearance and temporal variation.
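
As a rough illustration of this generation step, the sketch below optimizes a white-noise frame sequence against a prebuilt two-stream loss, for example the two_stream_loss sketched in the embodiment section above, with appearance targets taken from one texture and dynamics targets from another for style transfer. The optimizer choice, step count, and learning rate are assumptions, not details from the patent.

```python
import torch

def generate(loss_fn, shape, steps=300, lr=0.05):
    """Optimize a white-noise sequence of shape (T, 3, H, W) so that its two-stream
    statistics match the targets baked into loss_fn (assumed procedure)."""
    frames = torch.rand(shape, requires_grad=True)   # random noise initialization
    opt = torch.optim.Adam([frames], lr=lr)          # optimizer choice is an assumption
    for _ in range(steps):
        opt.zero_grad()
        loss = loss_fn(frames)
        loss.backward()
        opt.step()
    return frames.detach().clamp(0.0, 1.0)           # clamp back to a valid image range

# Hypothetical usage, reusing names from the earlier sketch:
# loss_fn = lambda x: two_stream_loss(x, app_targets_of_texture_A,
#                                     dyn_targets_of_texture_B,
#                                     appearance_features, dynamics_features)
# video = generate(loss_fn, shape=(12, 3, 256, 256))
```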

Description

Technical field

[0001] The invention relates to the technical field of dynamic texture synthesis in computer graphics, and in particular to a dynamic texture synthesis method based on a dual recognition stream convolutional network.

Background technique

[0002] Dynamic textures are image sequences with temporally repetitive characteristics that describe a dynamic natural scene. They exist widely in nature, for example ocean waves, waterfalls, flying flags, and flocks of birds. Many scholars at home and abroad have done extensive work on dynamic texture synthesis using simulation methods based on physical models and have achieved good results for certain phenomena. Synthesis based on physical simulation analyzes the physical laws of a specific phenomenon, establishes a simplified physical model, and then performs illumination calculation and rendering. Due to the complexity of dynamic natural scenes, however, these physical models are difficult to construct, computationally complex, and applicable only to specific phenomena.


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T5/50, G06T11/00, G06T13/00
CPC: G06T5/50, G06T11/001, G06T13/00, G06T2207/20081, G06T2207/20221
Inventor: 夏春秋
Owner: SHENZHEN WEITESHI TECH