
Facial expression generation method and device, mobile terminal and storage medium

An expression-generation technology in the field of image processing, addressing problems such as distortion of expression images, mismatch between expression templates and user photos, and poor synthesis quality of custom expression images.

Status: Inactive · Publication date: 2018-02-23
朱秋华


Problems solved by technology

However, with the latter method the expression template often fails to match the user's photo, so the synthesized custom expression image is of poor quality and may even be distorted.



Examples


Embodiment 1

[0028] Figure 1 is a flowchart of the expression generation method provided in the first embodiment of the present invention. This embodiment is applicable to a mobile terminal capable of taking pictures. The method is executed by an expression generation device, which can be implemented in software and/or hardware and is typically integrated in a mobile terminal. The method specifically includes the following steps:

[0029] S110: Acquire an image of the user's face area.

[0030] In the embodiment of the present invention, the user's facial region image is cropped from the user's self-portrait photo.
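The cropping step (S110) can be sketched as follows. This is a minimal illustration, not the patent's implementation: the patent only states that the face region is cropped from a self-portrait, so the image representation (a list of pixel rows) and the bounding box are assumptions here.

```python
def crop_region(image, box):
    """Crop a rectangular region from an image stored as a list of rows.

    `box` is (top, left, height, width) in pixel coordinates; in a real
    pipeline the box would come from a face detector.
    """
    top, left, h, w = box
    return [row[left:left + w] for row in image[top:top + h]]

# Toy 4x4 "image" of labeled pixels; crop a 2x2 face box at (1, 1).
image = [[f"p{r}{c}" for c in range(4)] for r in range(4)]
face = crop_region(image, (1, 1, 2, 2))
# face == [["p11", "p12"], ["p21", "p22"]]
```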

[0031] S120: Acquire facial feature information of the user according to the image.

[0032] In an embodiment of the present invention, optionally, acquiring the user's facial feature information based on the image includes: selecting multiple feature regions in the image; using each region as an expression feature unit; and determin...
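The feature-extraction step (S120) might look like the following sketch, under assumptions the patent does not spell out: facial landmarks are already available as named groups of (x, y) points, and each named group is treated as one expression feature unit. The landmark names and coordinates are illustrative only.

```python
# Illustrative landmark groups; a real system would obtain these
# from a landmark detector run on the cropped face image.
LANDMARKS = {
    "left_eye":  [(30, 40), (40, 38), (50, 40)],
    "right_eye": [(70, 40), (80, 38), (90, 40)],
    "mouth":     [(45, 80), (60, 88), (75, 80)],
}

def facial_feature_info(landmarks):
    """Summarize each expression feature unit by its points and centroid."""
    info = {}
    for unit, pts in landmarks.items():
        cx = sum(x for x, _ in pts) / len(pts)
        cy = sum(y for _, y in pts) / len(pts)
        info[unit] = {"points": pts, "centroid": (cx, cy)}
    return info

features = facial_feature_info(LANDMARKS)
```

Downstream steps (template matching, feature-point adjustment) can then operate per unit rather than on raw pixels.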

Embodiment 2

[0042] Figure 2 is a flowchart of the expression generation method provided in the second embodiment of the present invention. This embodiment is optimized on the basis of the above embodiment. The method specifically includes the following steps:

[0043] S210: Acquire an image of the user's face area.

[0044] S220: Acquire facial feature information of the user according to the image.

[0045] S230: Predefine an expression template; according to the template selected by the user, extract the template's expression feature units and determine the template's facial feature information, where the template includes at least one expression feature unit.

[0046] In the embodiment of the present invention, the expression template is preset, and the template includes feature-point information for at least one expression feature unit. The user selects a template in the gallery of the image processing software, and the expression adjustment parameters are determined according to ...
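The template-driven adjustment described above (S230) can be sketched as a blend of the user's feature points toward the template's feature points. The linear interpolation and the 0–1 strength parameter are assumptions for illustration; the patent only says the template supplies feature-point information per expression feature unit.

```python
def apply_template(user_pts, template_pts, strength):
    """Blend user feature points toward template points.

    strength is in [0, 1]: 0 keeps the user's face unchanged,
    1 moves the points fully onto the template's configuration.
    """
    return [
        ((1 - strength) * ux + strength * tx,
         (1 - strength) * uy + strength * ty)
        for (ux, uy), (tx, ty) in zip(user_pts, template_pts)
    ]

# Illustrative mouth feature unit: user's neutral mouth vs. a "smile" template.
neutral_mouth = [(45.0, 80.0), (60.0, 84.0), (75.0, 80.0)]
smile_template = [(43.0, 76.0), (60.0, 86.0), (77.0, 76.0)]
adjusted = apply_template(neutral_mouth, smile_template, 0.5)
# adjusted == [(44.0, 78.0), (60.0, 85.0), (76.0, 78.0)]
```

Blending per feature unit, rather than warping the whole photo onto the template, is one way to avoid the template/photo mismatch the background section describes.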

Embodiment 3

[0052] Figure 3 is a flowchart of the expression generation method provided in the third embodiment of the present invention. This embodiment is optimized on the basis of the first embodiment. The method specifically includes the following steps:

[0053] S310: Acquire an image of the user's face area.

[0054] S320: Acquire facial feature information of the user according to the image.

[0055] S330: Determine feature points of each expression feature unit according to the facial feature information of the user.

[0056] S340: Obtain the user's operation of adjusting the feature points and use it as the user's expression adjustment parameters.

[0057] In the embodiment of the present invention, the user can directly input an expression adjustment operation, and the expression adjustment parameters are obtained from that operation. For example, when the operation of adjusting the eyes to smile is obtained, the feature points of the upper ...
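The direct-adjustment path (S340) can be sketched as mapping each user drag to an (dx, dy) offset on one feature point. The point indices and offsets below are illustrative; the "lift the mouth corners" example loosely mirrors the smile adjustment the paragraph mentions (note that image y-coordinates grow downward, so a negative dy raises a point).

```python
def adjust_points(points, edits):
    """Apply user drags: `edits` maps point index -> (dx, dy) offset."""
    out = list(points)
    for i, (dx, dy) in edits.items():
        x, y = out[i]
        out[i] = (x + dx, y + dy)
    return out

# Illustrative mouth unit: left corner, center, right corner.
mouth = [(45, 80), (60, 84), (75, 80)]
# Lift both corners by 4 pixels to suggest a smile.
smiled = adjust_points(mouth, {0: (0, -4), 2: (0, -4)})
# smiled == [(45, 76), (60, 84), (75, 76)]
```

The adjusted points would then drive the expression rendering in the final generation step.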


Abstract

The invention discloses a facial expression generation method and device, a mobile terminal, and a storage medium. The method comprises the steps of: obtaining a facial region image of a user; obtaining facial feature information of the user according to the image; obtaining the user's expression adjustment parameters; and generating an expression according to the user's facial feature information and the expression adjustment parameters. The method, device, mobile terminal, and storage medium can thereby generate an expression from a human face image.

Description

Technical field

[0001] The embodiments of the present invention relate to image processing technology, and in particular to an expression generation method, device, mobile terminal, and storage medium.

Background technique

[0002] With the development of technology, social interaction on the Internet has become part of people's daily lives. Expression images are a common form of interactive information in social applications and have formed a kind of popular culture.

[0003] However, expression images are mostly pictures obtained from the Internet, or custom expression images synthesized from the user's photos through image processing software. With the latter method, the expression template often fails to match the user's photo, so the synthesized custom expression image is of poor quality and may even be distorted.

Summary of the invention

[0004] In view of this, the embodiments of the present invention provide a method, a device...


Application Information

Patent type & authority: Application (China)
IPC(8): G06T11/00; G06K9/00
CPC: G06T11/00; G06V40/168
Inventor: 朱秋华
Owner: 朱秋华