Augmented reality-based human face interaction entertainment method

A technology relating to real human faces and facial features, applied in the field of augmented-reality face interaction entertainment. It solves problems such as a limited scope of use, and achieves effects of easy popularization, natural appearance, and good face-following performance.

Inactive Publication Date: 2017-02-01
SUZHOU LIDUO DIGITAL TECH CO LTD

AI Technical Summary

Problems solved by technology

Estimating the 3D position and posture of a face generally requires knowing the structure of its 3D model. Currently, 3D face information can be obtained in real time with an RGB-D camera (such as Microsoft's Kinect sensor), but this type of technology requires a special sensor, which greatly limits its scope of use.



Examples


Embodiment 1

[0033] As shown in figure 1, an augmented reality-based human face interaction entertainment method comprises the following steps:

[0034] Step 1: Make a standard 3D face model and 3D facial materials in advance.

[0035] Step 2: Input a face video and detect 2D face feature points in each frame of the input video.

[0036] Step 3: Estimate the 3D position and posture of the face in each video frame by combining the standard 3D face model with the 2D face feature points.

[0037] Step 4: According to the 3D position and posture obtained in Step 3, transform the pre-made three-dimensional facial material to the same posture as the face in the video, and render it superimposed on the face video to achieve the augmented reality effect.

[0038] Step 5: Record and save the face augmented-reality video and share it with friends for interaction.
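Steps 2 and 3 above amount to finding the rigid pose (rotation R, translation t) that best reprojects the standard 3D model's landmark vertices onto the detected 2D feature points. The patent does not give the solver, so the following is only a minimal sketch of the underlying reprojection model under a pinhole camera; the intrinsics and point values are illustrative assumptions.

```python
import numpy as np

def project_points(X, R, t, K):
    """Project 3D model points X (N, 3) into the image with pose (R, t)
    and camera intrinsics K (pinhole model, no lens distortion)."""
    Xc = X @ R.T + t            # model frame -> camera frame
    x = Xc @ K.T                # apply intrinsics
    return x[:, :2] / x[:, 2:]  # perspective divide -> pixel coordinates

def reprojection_error(landmarks_2d, X, R, t, K):
    """Mean pixel distance between detected 2D landmarks and the projected
    3D model landmarks; a pose solver minimizes this over (R, t)."""
    proj = project_points(X, R, t, K)
    return np.linalg.norm(proj - landmarks_2d, axis=1).mean()

# Illustrative numbers: identity pose, model points 2 units in front of the camera.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
X = np.array([[0.0, 0.0, 2.0], [0.1, -0.1, 2.0]])
R, t = np.eye(3), np.zeros(3)
pts = project_points(X, R, t, K)
```

In practice an iterative or PnP-style solver would search for the (R, t) driving this reprojection error to a minimum for each video frame.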

[0039] In this embodiment, the standard human face three-dimensional model is made ...
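The pose transfer in Step 4, which moves the pre-made 3D material into the same posture as the face, is a rigid transform of the material's vertices by the estimated (R, t). A minimal numpy sketch; the function names and the example head-turn values are illustrative assumptions, not from the patent.

```python
import numpy as np

def rotation_about_y(theta):
    """Rotation matrix for a head turn of theta radians about the vertical axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])

def follow_face(material_vertices, R, t):
    """Rigidly transform pre-made material vertices (N, 3), defined in the
    standard model's coordinate frame, by the face pose (R, t) estimated
    for the current frame, so the rendered material follows the face's
    rotation and movement."""
    return material_vertices @ R.T + t

# Illustrative: a single material vertex follows a 90-degree head turn,
# with the face 2 units in front of the camera.
R = rotation_about_y(np.pi / 2)
t = np.array([0.0, 0.0, 2.0])
moved = follow_face(np.array([[1.0, 0.0, 0.0]]), R, t)
```

Applying this per frame is what makes the overlaid material appear attached to the head as it turns and moves.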



Abstract

The invention discloses an augmented reality-based human face interaction entertainment method. The method comprises the following steps: 1, pre-making a standard human face three-dimensional model and a face three-dimensional material; 2, inputting a human face video and detecting 2D human face feature points in each frame of the input video; 3, estimating the 3D position and posture of the human face in the video frames by combining the standard three-dimensional model with the 2D feature points; 4, transforming the pre-made face three-dimensional material to the same posture as the human face in the video according to the 3D position and posture obtained in step 3, and rendering it superimposed on the human face video to achieve an augmented reality effect; and 5, recording and storing the augmented reality video of the human face and sharing it with friends for interaction. With this method, the human face video can be shot with only a mobile phone camera, without external devices such as a special sensor; the method is therefore not limited in its application range, is easy to popularize, and has very good following performance and natural effects.

Description

Technical field

[0001] The invention belongs to the technical fields of computer vision and computer graphics, and in particular relates to an augmented reality face interaction entertainment method.

Background technique

[0002] Augmented reality face interaction entertainment technology mainly shoots face videos with a mobile phone and estimates the 3D position and posture of the face by tracking its feature points online in real time, so that preset 3D face material can be superimposed on the face video and follow the rotation and movement of the face, producing a novel augmented reality interactive effect. Further, interaction can also be realized by recording a video and sharing it with friends. Since facial movements are an important way of conveying human emotion, these technologies are widely used in human-computer interaction, film and television advertising, game production, video conferencing, etc.

[0003] The current augmented reality sol...
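The superimposition described above, where the rendered 3D material is composited over the face video so it appears attached to the face, can be sketched as per-pixel alpha blending. This is a common approach, not a formula the patent specifies; the array shapes and values below are illustrative.

```python
import numpy as np

def composite(frame, rendered_rgb, rendered_alpha):
    """Alpha-blend a rendered material layer over a video frame.
    frame, rendered_rgb: (H, W, 3) floats in [0, 1];
    rendered_alpha: (H, W), 1 where the material covers the face, 0 elsewhere."""
    a = rendered_alpha[..., None]          # broadcast alpha over RGB channels
    return a * rendered_rgb + (1.0 - a) * frame

# Illustrative 2x2 frame: the material fully covers one pixel,
# half-covers a second, and leaves the rest of the frame untouched.
frame = np.zeros((2, 2, 3))
layer = np.ones((2, 2, 3))
alpha = np.array([[1.0, 0.5], [0.0, 0.0]])
out = composite(frame, layer, alpha)
```

Soft alpha values at the material's edges avoid hard cutouts when the overlay follows the moving face.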

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T17/00, G06T19/20
CPC: G06T17/00, G06T19/20
Inventor: 许奇明
Owner: SUZHOU LIDUO DIGITAL TECH CO LTD