
Real-time real-person virtual hair try-on method based on 3D face tracking

A 3D face tracking, real-person technology, applied in the field of real-time real-person virtual hair try-on, which solves problems such as lack of realism and achieves the effects of enhancing the user experience, increasing realism, and avoiding time-consuming computation.

Active Publication Date: 2020-12-22
ZHEJIANG GONGSHANG UNIVERSITY

AI Technical Summary

Problems solved by technology

[0005] Aiming at the lack of realism and immersion and the time-consuming computation of existing virtual hair try-on technology, the present invention proposes a real-time real-person virtual hair try-on method based on 3D face tracking.

Embodiment Construction

[0037] The invention provides a real-time real-person virtual hair try-on method based on 3D face tracking. The user captures video frames through an ordinary webcam; the algorithm automatically fits the 3D hair model to the position of the user's face and head in each video frame and performs augmented reality rendering, so that the user can watch the try-on effect, combining virtual and real, in real time.
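
For orientation only, the per-frame loop described above can be sketched as follows. This is a minimal sketch, not the patent's implementation: only the webcam I/O (OpenCV) is concrete, while the tracking, hair-fitting, and rendering stages are passed in as hypothetical callables, because this excerpt does not specify their interfaces (they are detailed in the later parts of the technical scheme).

```python
# Hedged sketch of the per-frame try-on loop in [0037] (assumed structure).
# track_face_3d, fit_hair_model and render_ar are hypothetical placeholders
# for the tracking, hair placement and AR drawing stages of the scheme.
import cv2

def run_virtual_try_on(track_face_3d, fit_hair_model, render_ar, hair_model, cam=0):
    cap = cv2.VideoCapture(cam)                      # ordinary webcam as input
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            pose, landmarks_3d = track_face_3d(frame)    # 3D face tracking on this frame
            placed_hair = fit_hair_model(hair_model, pose, landmarks_3d)  # align hair to head
            output = render_ar(frame, placed_hair)       # augmented reality drawing
            cv2.imshow("virtual hair try-on", output)
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
    finally:
        cap.release()
        cv2.destroyAllWindows()
```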

[0038] The technical scheme adopted by the present invention comprises the following steps:

[0039] Part 1: Real-time 3D face tracking for virtual hair try-on

[0040] 1) Use the lightweight MobileNet (a deep neural network model structure) as the backbone network of the 3D face feature point regression algorithm. This network model balances accuracy and computational efficiency. Compared with 2D face feature points, the present invention adopts 3D face feature points because they better express the position and posture of the 3D face model in 3D space, and when the fa...
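
As a rough illustration of this step, the sketch below pairs a MobileNet trunk with a small regression head that outputs 3D face feature points. The patent names MobileNet as the backbone, but this excerpt does not specify the variant, the landmark count, or the head design; MobileNetV2, 68 landmarks, and a single linear head are assumptions made here for illustration.

```python
# Hedged sketch: MobileNet backbone + regression head producing N 3D landmarks.
# MobileNetV2 and 68 landmarks are assumptions; the patent only names MobileNet.
import torch
import torch.nn as nn
from torchvision.models import mobilenet_v2

class Face3DLandmarkNet(nn.Module):
    def __init__(self, num_landmarks=68):
        super().__init__()
        backbone = mobilenet_v2()                    # lightweight feature extractor
        self.features = backbone.features            # convolutional trunk only
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.head = nn.Linear(backbone.last_channel, num_landmarks * 3)  # (x, y, z) per point
        self.num_landmarks = num_landmarks

    def forward(self, x):                            # x: (B, 3, H, W) face crop
        f = self.pool(self.features(x)).flatten(1)   # (B, 1280) global feature
        return self.head(f).view(-1, self.num_landmarks, 3)

# Example: one 224x224 face crop -> (1, 68, 3) landmark tensor
model = Face3DLandmarkNet()
out = model(torch.randn(1, 3, 224, 224))
```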

Abstract

The invention relates to a real-time real-person virtual hair try-on method based on 3D face tracking. The method comprises the following steps: firstly, carrying out real-time 3D face tracking for virtual hair try-on; then fitting the three-dimensional hair model onto the head based on orientation consistency; and finally, re-coloring the three-dimensional hair model while keeping the color difference of adjacent pixels. Through the lightweight model and the 3D face feature points, the method avoids the time-consuming calculation and unstable tracking results caused by associating 2D face feature points with the vertexes of a three-dimensional face model, so that fast and accurate tracking is achieved. The registration of the three-dimensional hair model makes the tried-on hair fit the real face more accurately, which improves the realism of the virtual hair try-on. In addition, a method for changing the texture color of the three-dimensional hair model is added, which enhances the user's hair try-on experience and the functionality of the hair try-on system.
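
The re-coloring step is only summarized in the abstract, so the following is a hedged sketch of one way to change the hair texture color while keeping the color difference of adjacent pixels: shifting every texel by the same offset leaves all pixel-to-pixel differences unchanged. The patent's actual re-coloring formula is not given in this excerpt; this sketch only realizes the stated property.

```python
# Hedged sketch of hair-texture re-coloring that preserves adjacent-pixel
# differences: a uniform per-channel shift toward a target mean color.
import numpy as np

def recolor_hair_texture(texture, target_rgb):
    """texture: (H, W, 3) float array in [0, 1]; target_rgb: desired mean color."""
    tex = texture.astype(np.float32)
    offset = np.asarray(target_rgb, dtype=np.float32) - tex.reshape(-1, 3).mean(axis=0)
    recolored = tex + offset             # uniform shift keeps adjacent differences
    return np.clip(recolored, 0.0, 1.0)  # clipping may alter differences near 0 or 1

# Example: push a hair texture toward a dark red tone
tex = np.random.rand(256, 256, 3).astype(np.float32)
red_tex = recolor_hair_texture(tex, target_rgb=(0.45, 0.10, 0.10))
```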

Description

Technical field

[0001] The invention belongs to the fields of computer graphics and computer vision, and in particular relates to a real-time real-person virtual hair try-on method based on 3D face tracking.

Background technique

[0002] Hair is an obvious feature of personal image. In today's era of personalization, people pay more and more attention to their personal image, and finding a suitable hairstyle or wearing a suitable wig has become a natural choice in daily life. Compared with wig try-on in physical stores, virtual hair try-on is more convenient, less costly, and has a wider range of applications; it can be widely used in virtual social networking, online mall try-on, and personal hair styling.

[0003] The current mainstream virtual hair try-on systems are based on two-dimensional pictures: the user inputs a face photo and chooses a specific hairstyle, and the system automatically places the hair of the corresponding hairstyle on the head of the face in the photo. Obviously, in ...

Application Information

Patent Type & Authority: Application (China)
IPC (IPC8): G06T17/00; G06T19/00; G06K9/00; G06N3/08
CPC: G06T17/00; G06T19/006; G06N3/08; G06V40/174; G06V40/168
Inventors: 唐博奕, 杨文武, 杨柏林
Owner: ZHEJIANG GONGSHANG UNIVERSITY