Human face and body fusion processing method and system

A technology for fusion processing of the human face and body, applied in image data processing, instruments, and character and pattern recognition, etc., which can solve problems such as abrupt transition pixels, large color gaps, and unnatural results.

Active Publication Date: 2018-02-09
深圳市云之梦科技有限公司

AI Technical Summary

Problems solved by technology

[0007] The present invention solves the problem of an unnatural transition between the skin color of the human face and chin and the skin color of the neck region of the standard model's body: because different people's faces have different skin colors and photographs are taken under different lighting conditions, the skin color of a given face cannot transition naturally to the fixed body skin color of the standard model; and since there are transition pixels fr...



Examples


Embodiment Construction

[0043] An embodiment of the present invention provides a method for fusion processing of human face and body, including:

[0044] Acquire the head region;

[0045] Migrate the body color of the standard model to the skin color of the face;

[0046] Set the fusion background area;

[0047] Perform Poisson fusion.
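
Steps [0044] to [0047] outline the whole pipeline. Since the Poisson fusion step is not detailed further in this excerpt, the following is a minimal sketch of that last step using OpenCV's seamlessClone, which implements Poisson image editing; the file names and the placement point are illustrative assumptions, not values from the patent.

```python
import cv2

# Illustrative inputs (file names are assumptions, not from the patent):
# the segmented head image with hair, its binary mask, and the standard
# model body image after skin color migration.
head = cv2.imread("head.png")
head_mask = cv2.imread("head_mask.png", cv2.IMREAD_GRAYSCALE)
body = cv2.imread("recolored_body.png")

# Point in the body image where the center of the head is placed; here a
# hypothetical neck position is hard-coded.
center = (body.shape[1] // 2, 180)

# Poisson fusion: the head's gradients are kept, boundary colors are taken
# from the body image, so the seam is smoothed away.
fused = cv2.seamlessClone(head, body, head_mask, center, cv2.NORMAL_CLONE)
cv2.imwrite("fused.png", fused)
```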

[0048] In a preferred embodiment of the present invention, the image of the human body to be processed is segmented using head segmentation technology to obtain an image of the head region including the hair.
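
As a hedged illustration of this step, the snippet below assumes a binary head mask has already been produced by some head-segmentation model and saved to disk; the file names are placeholders, not artifacts of the patent.

```python
import cv2
import numpy as np

# Assumed inputs: the photo to be processed and a binary head mask, e.g.
# saved from a head-segmentation CNN; both file names are illustrative.
person = cv2.imread("person.jpg")
head_mask = cv2.imread("head_mask.png", cv2.IMREAD_GRAYSCALE)

# Keep only the head pixels (hair included) and crop to the mask's extent.
head_only = cv2.bitwise_and(person, person, mask=head_mask)
ys, xs = np.where(head_mask > 0)
head_crop = head_only[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
mask_crop = head_mask[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
```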

[0049] In a preferred embodiment, the head segmentation technology is implemented with a head segmentation convolutional neural network. In a further preferred embodiment, the skin color migration is implemented by aligning the mean and variance of the pixel color channels, including:

[0050] Obtain the key points of the face and remove the facial features to find the correct skin color area of the face; ...
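
The mean-and-variance alignment described in [0049] and [0050] corresponds to a Reinhard-style color transfer restricted to skin regions. The sketch below is one possible reading of it; the choice of Lab color space and the mask arguments are assumptions, since the excerpt only states that channel means and variances are aligned.

```python
import cv2
import numpy as np

def migrate_skin_color(body_bgr, body_skin_mask, face_bgr, face_skin_mask):
    """Shift the standard-model body's skin statistics toward the face's
    skin statistics by matching per-channel mean and standard deviation.
    Working in Lab space is an assumption; the excerpt only says the mean
    and variance of the pixel color channels are aligned."""
    body_lab = cv2.cvtColor(body_bgr, cv2.COLOR_BGR2LAB).astype(np.float32)
    face_lab = cv2.cvtColor(face_bgr, cv2.COLOR_BGR2LAB).astype(np.float32)

    body_px = body_lab[body_skin_mask > 0]   # N x 3 body skin pixels
    face_px = face_lab[face_skin_mask > 0]   # M x 3 face skin pixels
                                             # (facial features removed)
    b_mean, b_std = body_px.mean(0), body_px.std(0) + 1e-6
    f_mean, f_std = face_px.mean(0), face_px.std(0) + 1e-6

    # Align mean and variance channel by channel, only inside the body mask.
    body_lab[body_skin_mask > 0] = (body_px - b_mean) / b_std * f_std + f_mean
    return cv2.cvtColor(np.clip(body_lab, 0, 255).astype(np.uint8),
                        cv2.COLOR_LAB2BGR)
```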


PUM

No PUM

Abstract

The invention provides a human face and body fusion processing method and system. The method comprises the steps of obtaining a human head region; migrating the standard model body color to the human face skin color; setting a fusion background region; and performing Poisson fusion. On the one hand, because the face skin colors of different people differ considerably while the target body color after fusion is fixed, fusing a face image with the target body looks unnatural when the color difference is too large; the method of migrating the standard model body color toward the human face skin color is therefore adopted, so that the preprocessing before fusion already achieves a natural color transition. On the other hand, by limiting the boundary condition, the transition pixels of the jaw in the human head region are fused with the neck intersection region of the human body, removing the influence of the jaw transition pixels.
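
The abstract's second point, limiting the boundary condition so that the jaw transition pixels end up inside the fused neck intersection region, could be approximated by trimming the lower part of the head mask before the Poisson blend. This is only a hedged sketch of that idea, not the patent's actual boundary rule; the mask file and jaw position are hypothetical.

```python
import cv2

# Assumed inputs: the head mask from segmentation and the row of the chin
# (e.g. taken from the face key points); both values are illustrative.
head_mask = cv2.imread("head_mask.png", cv2.IMREAD_GRAYSCALE)
jaw_y = 300  # hypothetical vertical position of the jaw line

# Shrink only the part of the mask below the jaw so that the Poisson
# boundary moves into the neck intersection region: the jaw transition
# pixels then lie inside the blended interior instead of on the boundary.
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))
below_jaw = head_mask.copy()
below_jaw[:jaw_y] = 0
below_jaw = cv2.erode(below_jaw, kernel)
trimmed_mask = head_mask.copy()
trimmed_mask[jaw_y:] = below_jaw[jaw_y:]
```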

Description

technical field

[0001] The invention relates to the field of computer graphics, in particular to a method and system for fusion processing of the human face and body.

Background technique

[0002] With the development of human society, there are more and more choices of clothes. In a fast-paced life, people want to try on clothes without physically changing them, so virtual fitting products came into being. Virtual fitting technology can be roughly divided into two types: 2D and 3D. In 3D virtual fitting technology, the models and clothing are all made in 3D; the advantage is that the clothes and template body shapes can be seamlessly connected in 3D with 360-degree visualization, while the disadvantage is that 3D scanning of each piece of clothing is time-consuming and labor-intensive. 2D virtual fitting technology uses 2D image stitching, with the standard model either projected from 3D to 2D or provided directly as 2D pictures. The advantage is that ...

Claims


Application Information

IPC(8): G06T5/50; G06K9/00
CPC: G06T5/50; G06T2207/30201; G06V40/162; G06V40/165
Inventor: 芦爱余
Owner: 深圳市云之梦科技有限公司