
Adaptive clothes animation modeling method based on visual perception

A visual-perception-driven adaptive technology in the field of virtual reality and computer graphics. It addresses the problem that existing methods consider only the objective realism of clothing motion, making it difficult to guarantee the visual fidelity of the clothing animation, and achieves realistic visual effects while improving simulation efficiency.

Active Publication Date: 2017-09-26
NORTH CHINA ELECTRIC POWER UNIV (BAODING)


Problems solved by technology

[0004] However, a problem common to existing methods is that, when modeling the animation, only the objective realism of the clothing motion is considered, while the influence of subjective human visual perception on perceived realism is completely ignored. The human visual system's perception of deformation in different regions of the clothing is affected by many factors. If one merely builds a physically realistic animation model or improves the accuracy of the clothing model while entirely ignoring the impact of visual sensitivity on perceived realism, it is difficult to guarantee the visual fidelity of the final clothing animation.




Embodiment Construction

[0021] The present invention will be described in further detail below in conjunction with the accompanying drawings.

[0022] A method for modeling adaptive clothing animation based on visual perception, comprising the following steps:

[0023] 1. Construct a clothing visual saliency model that conforms to the characteristics of the human eye

[0024] 1.1 Eye movement data collection and preprocessing

[0025] The present invention uses a remote eye tracker to collect real eye movement data: subjects watch clothing animation videos in front of a screen while the tracker records their fixations, from which a focus map and a heat map of the video are generated. Gaussian convolution is applied to the focus maps superimposed across multiple subjects to obtain a continuous, smooth "ground truth" saliency map (as shown in Figure 1). From left to right, the figure shows the original image, the focus map, the heat map, and the ground-truth saliency map.
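For concreteness, the following is a minimal sketch of this preprocessing step. The fixation data format, image size, and Gaussian sigma are illustrative assumptions; the patent text does not fix them here.

```python
# A minimal sketch of the ground-truth preprocessing described above:
# superimpose all subjects' fixation points, then apply Gaussian
# convolution to obtain a continuous, smooth saliency map.
# Data layout and sigma are assumptions, not taken from the patent.
import numpy as np
from scipy.ndimage import gaussian_filter

def ground_truth_saliency(fixations_per_subject, height, width, sigma=25.0):
    """fixations_per_subject: one list of (x, y) pixel fixations per subject.
    Returns a saliency map normalized to [0, 1]."""
    focus_map = np.zeros((height, width), dtype=np.float64)
    for fixations in fixations_per_subject:      # one experimenter at a time
        for x, y in fixations:                   # accumulate fixation hits
            focus_map[int(y), int(x)] += 1.0
    saliency = gaussian_filter(focus_map, sigma=sigma)  # Gaussian convolution
    peak = saliency.max()
    return saliency / peak if peak > 0 else saliency
```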

[0026] 1.2 Constructing a visual saliency model using deep learning methods

[0027] In the pr...
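The paragraph above is truncated before the network details, so the PyTorch sketch below shows only the general idea named in the abstract: a deep convolutional network that maps a clothing-animation frame to a dense saliency map and is trained against the ground-truth maps from step 1.1. The architecture, loss, and hyperparameters are assumptions, not the patent's actual model.

```python
# Illustrative stand-in for the saliency model of step 1.2: a small
# convolutional network producing a per-pixel saliency map, trained
# against ground-truth maps. All design choices here are assumptions.
import torch
import torch.nn as nn

class SaliencyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(           # hierarchical abstract features
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
        )
        self.head = nn.Conv2d(64, 1, 1)          # per-pixel saliency logit

    def forward(self, frame):                    # frame: (N, 3, H, W)
        return torch.sigmoid(self.head(self.features(frame)))

model = SaliencyNet()
loss_fn = nn.BCELoss()                           # compare against ground truth
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_step(frame, gt_map):
    """One step on a (frame, ground-truth map) pair; gt_map in [0, 1]."""
    optimizer.zero_grad()
    loss = loss_fn(model(frame), gt_map)
    loss.backward()
    optimizer.step()
    return loss.item()
```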


Abstract

The invention discloses an adaptive clothing modeling method based on visual perception, comprising the following steps. Step 1: construct a clothing visual saliency model that accords with human-eye characteristics, applying deep convolutional neural network learning to extract hierarchical abstract features from each frame of the clothing animation, and training on these features together with real eye movement data to obtain the visual saliency model. Step 2: perform clothing subregion modeling: based on the saliency model constructed in Step 1, predict the visual saliency map of the clothing animation image, extract the attention degree of each clothing region, filter the clothing deformation, and set a detail simulation factor per region according to camera viewpoint motion information and physical deformation information. Step 3: construct an adaptive clothing model driven by visual perception and run the simulation: realize the subregion modeling with an adaptive multi-precision mesh technique, applying high-precision modeling to regions with a high detail simulation factor and low-precision modeling to regions with a low one, then perform dynamics computation and collision detection on this basis to build a visually realistic clothing animation system.
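Steps 2 and 3 can be read as computing a per-region detail simulation factor and mapping it to a mesh resolution. The sketch below illustrates one plausible reading; the weighting of saliency, camera viewpoint motion, and physical deformation, and the resolution thresholds, are invented for illustration and are not published in this excerpt.

```python
# Hypothetical sketch of steps 2-3: combine region attention (from the
# predicted saliency map), camera viewpoint motion, and physical
# deformation into a detail simulation factor, then pick a subdivision
# level. Weights and thresholds are illustrative assumptions.
def detail_simulation_factor(attention, camera_motion, deformation,
                             w_att=0.5, w_cam=0.2, w_def=0.3):
    """All inputs normalized to [0, 1]. Fast camera motion reduces the
    detail the eye can perceive, so it enters with a negative weight."""
    return max(0.0, w_att * attention - w_cam * camera_motion
               + w_def * deformation)

def mesh_resolution(factor, low=1, high=4):
    """High-precision modeling for regions with a high factor,
    low-precision modeling for the rest."""
    return high if factor > 0.5 else low

# Example: a highly salient, strongly deforming region under a slow camera
level = mesh_resolution(detail_simulation_factor(0.9, 0.1, 0.8))  # -> 4
```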

Description

Technical Field

[0001] The invention belongs to the technical field of virtual reality and computer graphics, and in particular relates to an adaptive clothing animation modeling method based on visual perception.

Background Technique

[0002] The visual fidelity of simulated clothing animation has long been a goal of researchers. To obtain fine clothing effects, the clothing model is usually modeled at high precision to express its rich deformation details. However, a high-precision model contains nearly 10,000 primitives, which demands a large amount of collision detection and the solution of large-scale dynamics equations, resulting in high computational cost and reduced system performance.

[0003] To solve the above problems, an effective approach is adaptive multi-precision modeling of the clothing. Existing methods mainly include the deformation-state-driven modeling method, that is, to estimate the pos...


Application Information

IPC(8): G06T 13/20, G06N 3/08
Inventor: Shi Min, Liu Yaning, Li Hao, Mao Tianlu
Owner: NORTH CHINA ELECTRIC POWER UNIV (BAODING)