
Expression representation model training method, facial expression representation method and corresponding devices

An expression representation model training method and facial expression representation technique, applied in the field of image processing, that address the low accuracy of existing facial expression representation methods and achieve the effect of improving representation accuracy.

Pending Publication Date: 2021-05-14
NETEASE (HANGZHOU) NETWORK CO LTD

AI Technical Summary

Problems solved by technology

[0004] The purpose of this application is to provide a training method for an expression representation model, a facial expression representation method, and corresponding devices, so as to alleviate the technical problem of low accuracy in current facial expression representation methods.



Examples


Embodiment Construction

[0051] In order to make the purpose, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions of the present application will be described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are some, but not all, of the embodiments of the present application. Based on the embodiments in this application, all other embodiments obtained by persons of ordinary skill in the art without creative effort fall within the scope of protection of this application.

[0052] The terms "including" and "having" mentioned in the embodiments of the present application and any variations thereof are intended to cover non-exclusive inclusion. For example, a process, method, system, product or device comprising a series of steps or units is not limited to the listed steps or units, but optionally also includes other unlisted steps or units, or optionally ...



Abstract

The invention provides an expression representation model training method, a facial expression representation method, and corresponding devices, and relates to the technical field of image processing; it addresses the technical problem of low accuracy in existing facial expression representation. The method comprises the following steps: a sample set is determined, in which each sample comprises a sample image and a sample label; and a to-be-trained expression representation model is trained with the sample set to obtain a trained expression representation model. The to-be-trained expression representation model includes a to-be-trained full-face representation sub-model and a trained identity representation sub-model, and the trained expression representation model correspondingly comprises a trained full-face representation sub-model and the trained identity representation sub-model. The output of the trained expression representation model is determined based on the difference between the output of the trained full-face representation sub-model and the output of the trained identity representation sub-model.
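To make the architecture described in the abstract concrete, below is a minimal PyTorch-style sketch of an expression representation computed as the difference between a trainable full-face embedding and a frozen, pre-trained identity embedding. The backbone choice (ResNet-18), the 512-dimensional embedding, the 7-class expression head, the cross-entropy loss and all names are illustrative assumptions, not details specified by the patent text.

```python
# Minimal sketch of the abstract's architecture; all names, dimensions,
# backbones and the loss are illustrative assumptions.
import torch
import torch.nn as nn
import torchvision.models as models

class ExpressionRepresentationModel(nn.Module):
    def __init__(self, embed_dim=512):
        super().__init__()
        # Full-face representation sub-model: trained on the sample set.
        self.full_face = models.resnet18(weights=None)
        self.full_face.fc = nn.Linear(self.full_face.fc.in_features, embed_dim)
        # Identity representation sub-model: assumed pre-trained elsewhere and frozen here.
        self.identity = models.resnet18(weights=None)
        self.identity.fc = nn.Linear(self.identity.fc.in_features, embed_dim)
        for p in self.identity.parameters():
            p.requires_grad = False

    def forward(self, image):
        # The expression representation is the difference between the
        # full-face embedding and the identity embedding, so identity-related
        # information is subtracted out and expression-related information remains.
        return self.full_face(image) - self.identity(image)

# Hypothetical training step on (sample image, sample label) pairs.
model = ExpressionRepresentationModel()
classifier = nn.Linear(512, 7)          # e.g. 7 basic expression classes (assumed)
optimizer = torch.optim.Adam(
    list(model.full_face.parameters()) + list(classifier.parameters()), lr=1e-4
)
criterion = nn.CrossEntropyLoss()

images = torch.randn(8, 3, 224, 224)    # dummy batch of sample images
labels = torch.randint(0, 7, (8,))      # dummy sample labels

logits = classifier(model(images))
loss = criterion(logits, labels)
loss.backward()
optimizer.step()
```

Only the full-face sub-model (and the illustrative classification head) receive gradients in this sketch; the identity sub-model stays fixed, which matches the abstract's distinction between the "to-be-trained" and "trained" sub-models.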

Description

Technical Field

[0001] The present application relates to the technical field of image processing, and in particular to a method for training an expression representation model, and a method and a device for representing facial expressions.

Background Technique

[0002] Humans have an innate ability to perceive expressions, but machines do not. Accurate representations of expressions help machines understand human emotions, which provides an important technical basis for building friendly, intelligent and harmonious human-computer interaction systems. Well-designed expression representation methods can also advance a range of related downstream tasks, including expression image retrieval, emotion recognition, facial action unit (Action Unit, AU) recognition, and facial expression generation.

[0003] Currently, methods for detecting facial expression representations include AU-...


Application Information

IPC(8): G06K9/00; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/08; G06V40/161; G06V40/174; G06N3/045; G06F18/22; G06F18/214
Inventors: 张唯, 冀先朋, 丁彧, 李林橙, 范长杰, 胡志鹏
Owner: NETEASE (HANGZHOU) NETWORK CO LTD