Image annotation method and device and storage medium

An image annotation technology, applied in the field of image annotation methods, devices and storage media, which can solve problems such as heavy annotation noise, training samples that are unfavorable for model training, and the difficulty of defining and labeling facial image quality.

Pending Publication Date: 2019-11-15
TENCENT TECH (SHENZHEN) CO LTD

AI Technical Summary

Problems solved by technology

[0004] However, due to the ambiguity and subjectivity inherent in the definition of facial image quality, it is very difficult to construct appropriate standards and have people label the quality of face images. For example, traditional methods often ask annotators to rate individual face images on a scale of 1 to 5. This is unreliable because there is no consensus standard defining the different face quality levels, and different users divide the levels according to their own preferences: the same face photo may be rated 1 by user a and 5 by user b. Training samples determined in this way therefore contain a large amount of annotation noise, which is not conducive to model training.

Embodiment Construction

[0033] Embodiments of the present application are described below in conjunction with the accompanying drawings.

[0034] At present, face images are mostly annotated by manually scoring a single face image, which is highly subjective. As a result, the training samples determined in this way contain a large amount of annotation noise, which is not conducive to model training.

[0035] Therefore, an embodiment of the present application provides an image annotation method that differs from the traditional approach. Two images to be annotated are presented as a comparison pair for face quality comparison, so the user's judgment rests on a measurable, relative standard, which reduces subjective influence. Based on the comparison results, the annotated images are used as nodes to construct a balanced binary tree in which, for any target node, the child nodes on different sides correspond to different face quality comparison results with respect to that node. When a target image to be annotated is subsequently compared, the images to compare against can be selected along a path in the balanced binary tree: the i-th image to be compared is chosen so that its face quality comparison result with the (i-1)-th compared image matches the result of the target image against the (i-1)-th compared image in the (i-1)-th comparison pair. This reduces the number of comparison pairs needed to annotate the image set and improves the annotation efficiency of face images.
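
To illustrate the comparison-pair selection described above, the following is a minimal, hypothetical Python sketch; it is not code from the patent. The names (QualityNode, insert_by_comparison, ranked_images) and the compare callback, which stands in for the annotator's pairwise judgment, are assumptions for illustration, and the rebalancing rotations that a true balanced binary tree would apply are omitted for brevity. Each new image is compared only against the images along a single root-to-leaf path, so the next image to compare against is always chosen by the result of the previous comparison.

```python
# Hypothetical sketch of pairwise face-quality annotation with a comparison tree.
# Names are illustrative only; rebalancing (e.g. AVL rotations) is omitted.

class QualityNode:
    """One already-annotated image stored as a tree node."""
    def __init__(self, image):
        self.image = image
        self.lower = None   # images judged to have lower face quality
        self.higher = None  # images judged to have higher face quality


def insert_by_comparison(root, image, compare):
    """Annotate `image` by comparing it only against images on one root-to-leaf path.

    `compare(a, b)` stands in for the human judgment and returns True when
    image `a` is judged to have better face quality than image `b`.
    """
    if root is None:
        return QualityNode(image)
    node = root
    while True:
        if compare(image, node.image):
            if node.higher is None:
                node.higher = QualityNode(image)
                return root
            node = node.higher   # next comparison pair follows the "higher" result
        else:
            if node.lower is None:
                node.lower = QualityNode(image)
                return root
            node = node.lower    # next comparison pair follows the "lower" result


def ranked_images(root):
    """In-order traversal: images ordered from lowest to highest judged quality."""
    if root is None:
        return []
    return ranked_images(root.lower) + [root.image] + ranked_images(root.higher)


if __name__ == "__main__":
    # Toy stand-in for the annotator: here "quality" is just a number on the image.
    images = [{"id": i, "quality": q} for i, q in enumerate([3, 7, 1, 9, 4])]
    judge = lambda a, b: a["quality"] > b["quality"]
    root = None
    for img in images:
        root = insert_by_comparison(root, img, judge)
    print([img["id"] for img in ranked_images(root)])  # ids from worst to best quality
```

An in-order traversal of the finished tree then yields the images ordered by judged face quality, which can serve as the relative annotation for the whole set.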

Abstract

The embodiment of the invention discloses an image annotation method. The method performs face quality comparison by using two to-be-annotated images as a comparison pair, so that the user's judgment basis has a measurable objective standard and the subjective impact is reduced. On the basis of the comparison results, the annotated images are used as nodes to construct a balanced binary tree, in which, for any target node, the child nodes on different sides of the target node correspond to different face quality comparison results with respect to the target node. When face quality comparison is subsequently carried out on a target to-be-annotated image, images can be selected from the balanced binary tree to form comparison pairs, wherein the face quality comparison result of the i-th to-be-compared image and the (i-1)-th to-be-compared image is the same as that of the target to-be-annotated image and the (i-1)-th to-be-compared image in the (i-1)-th comparison pair. The number of comparison pairs needed to annotate the to-be-annotated image set is thereby reduced, and the annotation efficiency of face images is improved.
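
To make the efficiency claim at the end of the abstract concrete, here is a rough, hypothetical estimate (the exact saving depends on how well the tree stays balanced and is not quantified in the abstract): exhaustively comparing every pair among n images requires n(n-1)/2 comparison pairs, whereas inserting each image along one root-to-leaf path of a balanced comparison tree requires on the order of log2(n) pairs per image.

```python
import math

n = 1000  # hypothetical size of the to-be-annotated image set
exhaustive_pairs = n * (n - 1) // 2  # every image compared against every other image
# Inserting an image when the comparison tree already holds k images costs
# roughly ceil(log2(k + 1)) comparison pairs (one root-to-leaf path).
tree_pairs = sum(math.ceil(math.log2(k + 1)) for k in range(1, n))
print(exhaustive_pairs, tree_pairs)  # 499500 vs. roughly 9000
```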

Description

Technical Field

[0001] The present application relates to the field of image processing, and in particular to an image annotation method, device and storage medium.

Background Technique

[0002] At present, images containing human faces can be automatically identified through a trained network model, and the quality of the face in the image can be determined. One way to obtain such a network model is to train it with labeled training samples.

[0003] In the field of face quality recognition, the training samples used to train a face quality recognition model can be obtained by manual labeling.

[0004] However, due to the ambiguity and subjectivity inherent in the definition of facial image quality, it is very difficult to construct appropriate standards and have people label the quality of face images. For example, traditional methods often require people to rate individual face images on a scale of 1 to 5, which is unreliable because there is no consensus standard defining the different face quality levels, and different users divide the levels according to their own preferences.

Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06T7/00
CPCG06T7/0002G06T2207/10004G06T2207/30168G06T2207/30201
Inventor 郭子滨陈星宇陈超李绍欣黄飞跃吴永坚
Owner TENCENT TECH (SHENZHEN) CO LTD