
Expression image marking method and system

An image annotation technology in the field of computers. It addresses the problems that real-time recording interferes with the subject's emotional experience, that the objectivity and accuracy of data annotation cannot be guaranteed, and that the accuracy of recall cannot be guaranteed, thereby avoiding tedious and complicated post-experiment work and ensuring the objectivity and accuracy of the annotation data.

Status: Inactive; Publication Date: 2017-01-18
HAINAN UNIVERSITY

AI Technical Summary

Problems solved by technology

The real-time recording method generally uses a traditional paper questionnaire to have subjects record their emotional state while they watch the video. In theory this method guarantees the authenticity and objectivity of the data, but the unavoidable problem is that recording emotional data at one moment inevitably interferes with the subject's emotional experience at the next moment.
The recall recording method has subjects recall, after the video has finished, the emotions they felt while watching it. This avoids the influence that real-time annotation at one moment has on the subject's viewing at the next moment, but it cannot guarantee the accuracy of the subject's recall of every moment.
[0004] In short, both the real-time recording method and the recall recording method only have the subjects mark key frames and cannot guarantee the objectivity and accuracy of the data labeling. Moreover, the video must be processed again afterwards to complete the emotion/expression segmentation, and the segmented emotion/expression images must then be annotated according to the subjects' annotation information.




Embodiment Construction

[0033] In order to make the purpose, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention are described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0034] Figure 1 is a schematic flow chart of the expression image marking method provided by an embodiment of the present invention. Referring to Figure 1, the expression image marking method includes:

[0035] S1: Play the shooting video and the induced video synchronously, and when the induced video plays to the preset time, implement th...
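To make the synchronization step concrete, below is a minimal sketch of how synchronized playback with a pause at a preset time and a frame capture could be realized. It is an illustration based on the abstract, not the patent's implementation; the file names, the preset_times_ms list and the use of OpenCV are assumptions.

```python
# Sketch (assumption, not the patent's implementation): play an induction
# (stimulus) video and the synchronized shooting video of the subject frame by
# frame, pause at preset timestamps, and save the current shooting-video frame
# as the candidate head-and-shoulder image to be labeled.
import cv2

induction = cv2.VideoCapture("induction.mp4")   # hypothetical stimulus video
shooting = cv2.VideoCapture("shooting.mp4")     # hypothetical recording of the subject
preset_times_ms = [12_000, 47_500]              # hypothetical moments expected to elicit emotion changes

fps = induction.get(cv2.CAP_PROP_FPS) or 25.0
delay_ms = max(1, int(1000 / fps))              # inter-frame delay for playback

pending = sorted(preset_times_ms)
while True:
    ok_i, frame_i = induction.read()
    ok_s, frame_s = shooting.read()
    if not (ok_i and ok_s):
        break                                   # one of the videos has ended
    pos_ms = induction.get(cv2.CAP_PROP_POS_MSEC)
    cv2.imshow("induction video", frame_i)

    if pending and pos_ms >= pending[0]:
        pending.pop(0)
        # Playback is paused here; the current shooting-video frame is saved
        # for the subject to label with expression type and intensity.
        cv2.imwrite(f"capture_{int(pos_ms)}ms.png", frame_s)
        cv2.waitKey(0)                          # wait for the annotator before resuming

    if cv2.waitKey(delay_ms) & 0xFF == ord("q"):
        break

induction.release()
shooting.release()
cv2.destroyAllWindows()
```

In this sketch the pause is driven by fixed preset timestamps in the induction video, matching the step described above; how the patent actually detects or selects the pause moment is not visible in the truncated text.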



Abstract

The invention provides an expression image marking method and system. An induction video and a shooting video collected during the experiment are played synchronously, and the induction video helps the testee recall the emotion changes experienced while watching it. At the moment the induction video causes an emotion change in the testee, playback of the shooting video is paused, a head-and-shoulder image of the testee is captured from the shooting video, and the captured image is marked with the expression type and intensity. By running the shooting video and the induction video synchronously, the method and system help the testee mark expressions during video watching, which ensures the objectivity and accuracy of the marking data. The marking data generated during the marking process are stored in a database automatically, sparing workers tedious and complicated post-experiment work, so that objective and accurate expression images and their marking information can be obtained in a simple and convenient way.
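As an illustration of the "marking data is stored in a database automatically" step described above, the following sqlite3 sketch shows one way such records could be persisted as soon as the testee confirms a label. The table layout, field names and intensity scale are assumptions made for this example, not details taken from the patent.

```python
# Sketch (assumption): automatically persist each marking record so that no
# manual post-experiment bookkeeping is needed. Schema and names are hypothetical.
import sqlite3

conn = sqlite3.connect("expression_labels.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS annotations (
           id INTEGER PRIMARY KEY AUTOINCREMENT,
           subject_id TEXT NOT NULL,          -- the testee being recorded
           video_time_ms INTEGER NOT NULL,    -- position in the induction video at the pause
           image_path TEXT NOT NULL,          -- captured head-and-shoulder image
           expression_type TEXT NOT NULL,     -- e.g. happiness, sadness, surprise
           intensity INTEGER NOT NULL         -- e.g. a 1-5 self-reported scale
       )"""
)

def save_annotation(subject_id, video_time_ms, image_path, expression_type, intensity):
    """Insert one marking record as soon as the testee confirms the label."""
    conn.execute(
        "INSERT INTO annotations "
        "(subject_id, video_time_ms, image_path, expression_type, intensity) "
        "VALUES (?, ?, ?, ?, ?)",
        (subject_id, video_time_ms, image_path, expression_type, intensity),
    )
    conn.commit()

# Example: the testee reports moderate surprise for the frame captured at 47.5 s.
save_annotation("S01", 47_500, "capture_47500ms.png", "surprise", 3)
```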

Description

Technical field

[0001] The invention belongs to the technical field of computers, and in particular relates to an expression image marking method and system.

Background technique

[0002] Facial expression recognition technology has received widespread attention and has gradually become a research hotspot. However, since existing facial expression images cannot support the in-depth study of complex algorithms, it is necessary to re-establish the facial expression database. Existing expression databases are constructed mainly in two forms: image-based and video-based. An image-based expression database consists of expression images collected by a camera; it does not involve the problem of expression segmentation, and the images can be marked directly, which is relatively simple. A video-based expression database consists of expression videos collected by a camera; the expression images are marked after being segmented from the video. ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): H04N21/433; H04N21/44; G06K9/00
CPC: H04N21/4333; H04N21/44008; G06V40/174
Inventors: 刘永娜, 王利绒
Owner: HAINAN UNIVERSITY