
Face appearance editing method based on real-time video proper decomposition

A real-time video editing technology, applied in image analysis, image enhancement, instruments, etc. It addresses problems of prior methods such as decompositions that are not exact intrinsic decompositions, failure to consider the hair and neck, and limited editing, and achieves the effects of reducing computing overhead and improving the efficiency and versatility of intrinsic decomposition.

Active Publication Date: 2016-09-21
ZHEJIANG UNIV +1

AI Technical Summary

Problems solved by technology

Since the image is not intrinsically decomposed, the realism-preserving edits that these methods can achieve are limited.
A 2009 work decomposed the face image into a structure layer, a detail layer, and a color layer (GUO, D., AND SIM, T. 2009. Digital face makeup by example. In Proc. IEEE CVPR '09, 73–79.), but this decomposition is not an exact intrinsic decomposition.
That method achieves the best results of the current state of the art, but its computational cost is huge; moreover, it only considers the skin area of the face, ignoring the hair and neck, which are regions that the present invention can edit.



Examples


Embodiment

[0079] The inventors implemented an embodiment of the present invention on a machine equipped with an Intel dual-core i5 CPU, an NVidia GTX 660 GPU, and 16 GB of memory. All experimental results shown in the accompanying drawings were obtained with the parameter values listed in the detailed description. For a webcam with a resolution of 640×480, most ordinary users can complete the interactive segmentation within one minute; the automatic preprocessing of the reference image usually takes about 30 seconds, of which GMM fitting takes about 10 seconds and building the lookup table takes less than 20 seconds. At run time, the system processes more than 20 frames per second, covering face tracking, graph-cut-based correspondence of different regions, intrinsic decomposition, and appearance editing.
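For orientation, the following is a minimal sketch of such a per-frame runtime loop in Python with OpenCV. The four stage functions (track_face, correspond_regions, intrinsic_decompose, apply_edits) are hypothetical placeholders standing in for the patent's algorithms, which are not reproduced here; only the webcam capture loop and the timing measurement are concrete.

    import time
    import cv2  # OpenCV, used here only for webcam capture and display

    # Hypothetical placeholders for the four per-frame stages named above.
    def track_face(frame):
        return None  # would return fitted face landmarks / a face model

    def correspond_regions(frame, face):
        return None  # would map reference-image regions to the frame via graph cuts

    def intrinsic_decompose(frame, regions):
        return frame, None  # would return (albedo, shading) layers of the frame

    def apply_edits(albedo, shading, edits):
        return albedo  # would transfer the user's reference edits and recompose

    cap = cv2.VideoCapture(0)
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)
    edits = {}  # edits prepared interactively on the reference image beforehand

    while True:
        t0 = time.perf_counter()
        ok, frame = cap.read()
        if not ok:
            break
        face = track_face(frame)
        regions = correspond_regions(frame, face)
        albedo, shading = intrinsic_decompose(frame, regions)
        output = apply_edits(albedo, shading, edits)
        print("%.1f fps" % (1.0 / (time.perf_counter() - t0)), end="\r")
        cv2.imshow("edited face", output)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()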

[0080] The inventors conducted experiments on various face appearance edi...



Abstract

The invention discloses a face appearance editing method based on intrinsic decomposition of real-time video. Before video playback, a user edits the intrinsic albedo layer and shading layer of a reference face image; during playback, these edits are transferred in real time to the corresponding layers of the face in the video stream. The method mainly comprises the following steps: processing the reference face image, performing real-time intrinsic decomposition of the video stream, and performing face appearance editing on the video stream. The method presents, for the first time, an intrinsic decomposition technique for real-time video, by means of which many kinds of face appearance edits, such as wrinkle removal, ambient-light change, and hue transfer, can be realized in real-time video.
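As an informal illustration of the intrinsic model assumed above (each frame is the per-pixel product of an albedo layer and a shading layer, I = A * S), the following Python/NumPy sketch shows how an edit made on a reference albedo layer could be recomposed into an output image. The helper names and the simple ratio-based transfer rule are illustrative assumptions for this sketch, not the patent's actual region-correspondence-based transfer.

    import numpy as np

    def recompose(albedo, shading):
        """Recompose an image from its intrinsic layers: I = A * S, per pixel."""
        return np.clip(albedo * shading, 0.0, 1.0)

    def transfer_albedo_edit(frame_albedo, ref_albedo, edited_ref_albedo):
        """Carry a user's edit of the reference albedo over to the frame albedo
        as a per-pixel ratio (an illustrative simplification of edit transfer)."""
        eps = 1e-6
        ratio = edited_ref_albedo / (ref_albedo + eps)
        return np.clip(frame_albedo * ratio, 0.0, 1.0)

    # Toy example: tint the reference albedo (e.g. virtual make-up) and recompose
    # the current frame from its (assumed given) albedo and shading layers.
    h, w = 480, 640
    frame_albedo = np.random.rand(h, w, 3)   # stands in for the decomposed frame
    frame_shading = np.random.rand(h, w, 1)
    ref_albedo = np.random.rand(h, w, 3)
    edited_ref_albedo = ref_albedo * np.array([0.9, 0.8, 0.8])  # reddish tint
    edited_frame = recompose(
        transfer_albedo_edit(frame_albedo, ref_albedo, edited_ref_albedo),
        frame_shading)
    print(edited_frame.shape)  # (480, 640, 3)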

Description

Technical field

[0001] The invention relates to the field of computer video processing, and in particular to a method for performing intrinsic decomposition and appearance editing on real-time video streams of human faces.

Background technique

[0002] There has been a great deal of research on editing face images, such as swapping different faces (BITOUK, D., KUMAR, N., DHILLON, S., BELHUMEUR, P. N., AND NAYAR, S. K. 2008. Face Swapping: Automatically Replacing Faces in Photographs. ACM Trans. Graph. 27, 3, 39.) and morphing faces toward more attractive face structures (LEYVAND, T., COHEN-OR, D., DROR, G., AND LISCHINSKI, D. 2008. Data-driven enhancement of facial attractiveness. ACM Trans. Graph. 27, 3, 38.). Unlike these efforts, which alter the face itself, the present invention aims to edit the appearance of faces. Another line of editing work synthesizes facial features to change expressions (YANG, F., WANG, J., SHECHTMAN, E., BOURDEV, L., AND METAXAS, D. 2011. Expression flow for...


Application Information

IPC(8): G06T3/00, G06T7/00
CPC: G06T2207/30201, G06T2207/10016, G06T2207/20101, G06T3/04
Inventor: 周昆, 柴蒙磊, 翁彦琳, 陈凯迪, 邵天甲
Owner: ZHEJIANG UNIV