
Data-driven indoor scene coloring method

A data-driven indoor scene coloring technology, applied in image data processing, 3D image processing, instruments, etc., addressing problems such as the tedious, time-consuming, and troublesome nature of manual color selection

Active Publication Date: 2016-07-27
NANJING UNIV

Problems solved by technology

However, for ordinary people, choosing colors that are visually harmonious when combined is a tedious and time-consuming task. Even for professional interior designers or artists, who can select color schemes based on rich experience and intuition, coloring a model remains troublesome.


Examples


Embodiment 1

[0062] The flow chart of this method is shown in Figure 1. The method is divided into six major processes: first, establish an image-model database and a texture database; then train a model classifier on the 3D models of each type of furniture in the image-model database; then extract the color themes of the furniture in the image-model database and establish a probability model; then solve the optimal coloring scheme according to the established probability model and the color theme input by the user; then segment each 3D model in the input scene using the corresponding classifier; finally, according to the resulting coloring scheme, assign a corresponding material to each piece of furniture in the input scene.
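The six processes above can be wired together end to end. The following is a minimal, hypothetical Python sketch: every function name and data structure is an illustrative assumption, and trivial stand-ins replace the patent's actual classifier, probability model, and solver.

```python
# Hypothetical sketch of the six-stage pipeline; all names and data
# shapes are illustrative assumptions, not the patent's implementation.

def build_databases(images, models, textures):
    # Stage 1: pair scene images with 3D models; keep textures separately.
    return {"image_model": list(zip(images, models)), "textures": textures}

def train_classifiers(db):
    # Stage 2: one part classifier per furniture category (stubbed: a
    # "classifier" here just splits a model into two named parts).
    return {cat: (lambda model: [f"{model}:part{i}" for i in range(2)])
            for cat in {m.split("_")[0] for _, m in db["image_model"]}}

def build_color_model(db):
    # Stage 3: count theme/category co-occurrences as a stand-in for the
    # probability model over extracted color themes.
    counts = {}
    for img, model in db["image_model"]:
        cat = model.split("_")[0]
        counts.setdefault(cat, {}).setdefault(img["theme"], 0)
        counts[cat][img["theme"]] += 1
    return counts

def solve_coloring(color_model, user_theme, scene):
    # Stage 4: per furniture, prefer the user's theme when the learned
    # model has seen it, otherwise fall back to the most frequent theme.
    plan = {}
    for model in scene:
        seen = color_model.get(model.split("_")[0], {})
        plan[model] = user_theme if user_theme in seen else max(seen, key=seen.get)
    return plan

def assign_materials(plan, classifiers, textures):
    # Stages 5-6: segment each scene model with its classifier, then map
    # a texture of the chosen theme onto every part.
    out = {}
    for model, theme in plan.items():
        for part in classifiers[model.split("_")[0]](model):
            out[part] = textures[theme]
    return out

images = [{"theme": "warm"}, {"theme": "cool"}]
models = ["chair_01", "chair_02"]
textures = {"warm": "oak.jpg", "cool": "steel.jpg"}

db = build_databases(images, models, textures)
mats = assign_materials(
    solve_coloring(build_color_model(db), "warm", ["chair_01"]),
    train_classifiers(db), textures)
print(mats)  # both parts of chair_01 mapped to the warm texture
```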

[0063] Specifically, as shown in Figure 1, the present invention discloses a data-driven indoor scene coloring method, which mainly includes the following steps: Step 1, establish a database: collect images of different scenes, f...

Embodiment 2

[0107] The hardware environment of this embodiment is an Intel Core i5-4590 at 3.3 GHz with 8 GB of memory; the software environment is Microsoft Visual Studio 2010, Microsoft Windows 7 Professional, and 3ds Max 2012. The input models come from the network.

[0108] The invention discloses a data-driven indoor scene coloring method. Its core is to solve the optimal coloring scheme for each piece of furniture in the scene and to segment each furniture mesh according to the images, so that a corresponding material can be assigned to each part. The method includes the following steps:

[0109] Step 1, database establishment: collect images, models and material samples from the network, and process the collected data to establish an image-model database and a texture database;

[0110] Step 2, training the model classifier: perform feature extraction on the 3D model of each type of furniture in the image-model database and train the classifier;

[0111] Step 3, establish a probability model of image furniture...



Abstract

The invention discloses a data-driven indoor scene coloring method. The method comprises the steps of: establishing an image-model database and a texture database; training model classifiers for the 3D models of each type of furniture in the image-model database; extracting color themes of the furniture in the image-model database and establishing a probability model; solving an optimal coloring scheme according to the established probability model and a color theme input by the user; utilizing the corresponding classifier to segment each 3D model in an input scene; and finally, determining the corresponding material of each piece of furniture in the input scene according to the obtained coloring scheme.
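The "solving an optimal coloring scheme" step can be illustrated with a brute-force sketch: enumerate candidate color assignments for a small scene and keep the one maximizing a score that balances closeness to the user's theme against a learned per-furniture color preference. The scoring function, weights, and data below are assumptions for illustration, not the patent's actual formulation.

```python
# Illustrative solver: exhaustively score candidate color assignments.
from itertools import product

def dist(a, b):
    # Euclidean distance between two RGB colors.
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def solve_scheme(scene, candidates, preference, user_theme, w=0.5):
    # scene: furniture names; candidates: RGB colors to choose among;
    # preference[f]: learned preferred color for furniture f (a stand-in
    # for a full probability model); user_theme: RGB the user asked for.
    best, best_score = None, float("-inf")
    for colors in product(candidates, repeat=len(scene)):
        score = -sum(w * dist(c, user_theme) + (1 - w) * dist(c, preference[f])
                     for f, c in zip(scene, colors))
        if score > best_score:
            best, best_score = dict(zip(scene, colors)), score
    return best

scene = ["sofa", "table"]
candidates = [(0.9, 0.1, 0.1), (0.1, 0.1, 0.9)]
preference = {"sofa": (0.8, 0.2, 0.2), "table": (0.2, 0.2, 0.8)}
scheme = solve_scheme(scene, candidates, preference, user_theme=(1.0, 0.0, 0.0))
print(scheme["sofa"])  # the reddish candidate wins for the sofa
```

Exhaustive enumeration only works for toy scenes; a real solver over many furniture pieces and themes would need a probabilistic or combinatorial optimization instead.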

Description

Technical field

[0001] The invention belongs to the field of computer graphics and relates to a data-driven indoor scene coloring method.

Background technique

[0002] For an indoor scene composed of multiple furniture models, current research on model layout is relatively mature, but there has been little progress on automatic coloring of furniture models and scenes. The most intuitive impression of a scene comes from its color, so model coloring is crucial to building a beautiful and harmonious 3D scene. For the average person, however, choosing colors that combine harmoniously is a tedious and time-consuming task. Even for a professional interior designer or artist, who can choose a color scheme based on rich experience and intuition, coloring a model is still troublesome.

[0003] In fact, there are a large number of indoor scene images designed by designers or photographers on the Internet. Therefore, if the machine can learn from these exis...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T19/00, G06T15/04, G06T19/20
CPC: G06T15/04, G06T19/00, G06T19/20
Inventors: 马晗, 郭延文, 朱捷, 夏元轶
Owner: NANJING UNIV