
Texture recognition model training method, texture migration method and related device

A texture recognition model and training method, applied in the field of computer vision. It addresses the problems of high modeling cost and low efficiency, achieving a low learning threshold, reduced cost, and improved efficiency.

Pending Publication Date: 2022-04-29
BIGO TECH PTE LTD

AI Technical Summary

Problems solved by technology

[0004] The present invention proposes a texture recognition model training method, a texture migration method, and related devices to solve the problems of high modeling cost and low efficiency.



Examples


Embodiment 1

[0044] Figure 1 is a flow chart of a texture recognition model training method provided by Embodiment 1 of the present invention. This embodiment is applicable to self-supervised training of a texture recognition model. The method can be executed by a training device for a texture recognition model, which can be implemented in software and/or hardware and configured in computer equipment such as servers, workstations, and personal computers. The method specifically includes the following steps:

[0045] Step 101. Acquire first image data.

[0046] In this embodiment, multiple frames of first image data can be collected through channels such as public data sets. The first image data is two-dimensional image data captured in real scenes, containing objects such as characters, animals, tools, and buildings, which have realistic textures.

[0047] Step 102. Call the texture detection network...
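
To make the training flow of Embodiment 1 concrete, the following is a minimal PyTorch-style sketch of the self-supervised loop described in steps 101 and 102 and summarized in the Abstract: a texture detection network extracts texture maps, a parameter detection network extracts texture parameters, both are differentiably rendered back into the scene of the first image data, and the rendered second image data supervises both networks. The network architectures, the toy differentiable_render blend, and all class and variable names are illustrative assumptions, not the concrete design disclosed in the patent.

```python
import torch
import torch.nn as nn

class TextureDetectionNet(nn.Module):
    """Illustrative network: predicts a stack of texture maps from one image."""
    def __init__(self, num_maps: int = 4):
        super().__init__()
        self.num_maps = num_maps
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3 * num_maps, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, image):                        # image: (B, 3, H, W)
        maps = self.backbone(image)                  # (B, 3 * num_maps, H, W)
        b, _, h, w = maps.shape
        return maps.view(b, self.num_maps, 3, h, w)  # (B, num_maps, 3, H, W)

class ParameterDetectionNet(nn.Module):
    """Illustrative network: regresses per-image texture parameters."""
    def __init__(self, num_params: int = 8):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, num_params),
        )

    def forward(self, image):
        return self.encoder(image)                   # (B, num_params)

def differentiable_render(texture_maps, texture_params, scene_image):
    """Toy stand-in for a differentiable renderer: blends the texture maps,
    weighted by the predicted parameters, back into the original scene so
    gradients flow to both networks."""
    weights = torch.softmax(texture_params[:, : texture_maps.size(1)], dim=1)
    blended = (texture_maps * weights[:, :, None, None, None]).sum(dim=1)
    return 0.5 * scene_image + 0.5 * blended         # second image data (B, 3, H, W)

texture_net, param_net = TextureDetectionNet(), ParameterDetectionNet()
optimizer = torch.optim.Adam(
    list(texture_net.parameters()) + list(param_net.parameters()), lr=1e-4)

first_image = torch.rand(2, 3, 64, 64)               # stand-in for step 101's real photos
for _ in range(3):                                    # a few illustrative iterations
    maps = texture_net(first_image)                   # step 102: extract texture maps
    params = param_net(first_image)                   # extract texture parameters
    second_image = differentiable_render(maps, params, first_image)  # differentiable rendering
    loss = nn.functional.l1_loss(second_image, first_image)  # second image data as supervision signal
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```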

Embodiment 2

[0129] Figure 4 is a flowchart of a texture migration method provided by Embodiment 2 of the present invention. This embodiment is applicable to migrating textures in image data. The method can be executed by a texture migration device, which can be implemented in software and/or hardware and configured in computer equipment such as servers, workstations, personal computers, and mobile terminals (such as mobile phones and tablet computers). The method specifically includes the following steps:

[0130] Step 401, load the texture recognition model.

[0131] In this embodiment, the texture recognition model can be pre-trained. As shown in Figure 5, the texture recognition model includes a texture detection network and a parameter detection network, and the training method is as follows:

[0132] Obtain the first image data;

[0133] calling the map detection network to extract ...
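
The migration flow of Embodiment 2 can be sketched along similar lines: load the pre-trained texture recognition model (its texture detection and parameter detection networks), extract texture maps and parameters from a source image, and render them onto a target scene. The sketch below reuses the illustrative TextureDetectionNet, ParameterDetectionNet, and differentiable_render definitions from the Embodiment 1 sketch; the checkpoint path and state-dict layout are hypothetical, not the patent's API.

```python
import torch

# Assumes the illustrative TextureDetectionNet, ParameterDetectionNet and
# differentiable_render from the Embodiment 1 sketch are already defined.

# Step 401: load the texture recognition model (both sub-networks).
texture_net, param_net = TextureDetectionNet(), ParameterDetectionNet()
# Hypothetical checkpoint layout; replace with however the weights are persisted.
# state = torch.load("texture_recognition_model.pt")
# texture_net.load_state_dict(state["texture_net"])
# param_net.load_state_dict(state["param_net"])
texture_net.eval()
param_net.eval()

source_image = torch.rand(1, 3, 64, 64)   # image whose texture is to be migrated
target_scene = torch.rand(1, 3, 64, 64)   # scene that should receive the texture

with torch.no_grad():
    maps = texture_net(source_image)       # texture maps extracted from the source
    params = param_net(source_image)       # texture parameters of the source
    migrated = differentiable_render(maps, params, target_scene)  # textured target scene
```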

Embodiment 3

[0169] Figure 6 is a structural block diagram of a training device for a texture recognition model provided in Embodiment 3 of the present invention. The texture recognition model includes a texture detection network and a parameter detection network. The device may specifically include the following modules:

[0170] An image data acquisition module 601, configured to acquire first image data;

[0171] A texture map extraction module 602, configured to call the texture detection network to extract multiple frames of texture maps from the first image data;

[0172] A texture parameter extraction module 603, configured to call the parameter detection network to extract texture parameters from the first image data;

[0173] An image data rendering module 604, configured to differentiably render the texture map and the texture parameters to the scene in the first image data to obtain second image data;

[0174] A network training module 605, configured to use the second imag...
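
For illustration only, the module layout of Embodiment 3 (modules 601 through 605) could be mirrored as a small wrapper class with one method per module. It reuses the illustrative networks and renderer from the Embodiment 1 sketch and is an assumed structure, not the device actually claimed.

```python
import torch
import torch.nn as nn

# Assumes the illustrative TextureDetectionNet, ParameterDetectionNet and
# differentiable_render from the Embodiment 1 sketch are already defined.

class TextureRecognitionTrainingDevice:
    """Hypothetical wrapper mirroring modules 601-605 of Embodiment 3."""

    def __init__(self):
        self.texture_net = TextureDetectionNet()
        self.param_net = ParameterDetectionNet()
        self.optimizer = torch.optim.Adam(
            list(self.texture_net.parameters()) + list(self.param_net.parameters()),
            lr=1e-4)

    def acquire_image_data(self, raw_batch):                  # module 601
        return raw_batch.float()

    def extract_texture_maps(self, first_image):              # module 602
        return self.texture_net(first_image)

    def extract_texture_params(self, first_image):            # module 603
        return self.param_net(first_image)

    def render_image_data(self, maps, params, first_image):   # module 604
        return differentiable_render(maps, params, first_image)

    def train_networks(self, first_image):                    # module 605
        maps = self.extract_texture_maps(first_image)
        params = self.extract_texture_params(first_image)
        second_image = self.render_image_data(maps, params, first_image)
        loss = nn.functional.l1_loss(second_image, first_image)
        self.optimizer.zero_grad()
        loss.backward()
        self.optimizer.step()
        return loss.item()
```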



Abstract

The invention provides a texture recognition model training method, a texture migration method, and a related device. The training method comprises the following steps: acquiring first image data; calling a map detection network to extract a plurality of frames of texture maps from the first image data; calling a parameter detection network to extract texture parameters from the first image data; differentiably rendering the texture maps and the texture parameters onto the scene in the first image data to obtain second image data; and using the second image data as a supervision signal to train the map detection network and the parameter detection network. The trained networks operate automatically, are imperceptible to the user, and have a low learning threshold, while also being able to render textures onto a scene, so designers can perform fewer texture-processing operations. This improves the user experience and operational convenience, reduces the time and energy consumed by modeling, improves efficiency, and reduces cost.

Description

Technical Field
[0001] The invention relates to the technical field of computer vision, and in particular to a texture recognition model training method, a texture migration method, and related devices.
Background
[0002] In scenes such as games and video entertainment, it is necessary to model characters, props, buildings, and the like, that is, to design objects in proportion for scenes such as characters, props, and buildings.
[0003] At present, modeling work is usually done manually by designers using a modeling engine. The learning threshold of the modeling engine is relatively high and its operation is relatively cumbersome, so designers spend a lot of time and energy on modeling, resulting in high cost and low efficiency.
Summary of the Invention
[0004] The invention proposes a texture recognition model training method, a texture migration method, and related devices to solve the problems of high modeling cost and low efficiency.
[0005] In the first a...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06V10/44; G06V10/82; G06N3/04; G06N3/08
CPC: G06N3/084; G06N3/048; G06N3/045
Inventor: 胡忠冰, 张彤
Owner: BIGO TECH PTE LTD