Face part identification method and device

A face part identification technology, applied in the fields of character and pattern recognition, computer parts, and instruments. It solves the problems that multiple face parts cannot be effectively obtained at one time, that real-time analysis needs cannot be effectively met, and that recognition performance is poor. It achieves good uniqueness and spatial invariance, improves detection and recognition efficiency, and provides strong classification and learning ability.

Active Publication Date: 2016-11-09
HUNAN VISUALTOURING INFORMATION TECH CO LTD
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0003] Traditional face recognition solutions are mainly aimed at two-dimensional visible-light images. Such images are easily affected by factors such as image resolution, lighting, and shooting angle, resulting in poor recognition results and low recognition accuracy. Moreover, the traditional face recognition method recognizes only one face part at a time; it cannot effectively obtain multiple face parts at once for expression recognition, face reconstruction, or facial gesture recognition, and thus cannot effectively meet real-time analysis needs.


Image

  • Face part identification method and device

Examples


Embodiment 1

[0058] Please refer to Figure 1, which shows a flow chart of the face part identification method provided by this embodiment. The method includes:

[0059] Step S110, obtaining a depth image.

[0060] In this embodiment, the depth image is obtained by a depth sensor, wherein the depth image includes a depth value of each pixel obtained by the depth sensor.

[0061] Please refer to Figure 2. Assume that the field of view of the depth sensor in this embodiment is (α, β) and that the resolution of the obtained depth image is (m, n). A pixel coordinate system is established on the depth image, and the depth value of pixel p = (x, y) is recorded as D(x, y).
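Using only the symbols defined above, the mapping from a depth-image pixel to camera-space coordinates can be sketched as follows. The linear-angle back-projection model, the default field of view, and the default resolution are assumptions made for illustration; the source defines only (α, β), (m, n), and D(x, y).

```python
import math

def pixel_to_3d(x, y, depth,
                fov=(math.radians(60), math.radians(45)),  # assumed (alpha, beta)
                res=(640, 480)):                            # assumed (m, n)
    """Back-project a depth-image pixel to camera-space coordinates.

    This linear-angle model is an illustrative assumption, not the
    patent's formula: pixel coordinates are mapped to [-1, 1] across
    the sensor and scaled by the half-angle tangents.
    """
    alpha, beta = fov
    m, n = res
    u = 2.0 * x / m - 1.0          # horizontal position in [-1, 1]
    v = 2.0 * y / n - 1.0          # vertical position in [-1, 1]
    X = depth * math.tan(alpha / 2.0) * u
    Y = depth * math.tan(beta / 2.0) * v
    Z = depth                      # depth value D(x, y) along the optical axis
    return X, Y, Z
```

A pixel at the image center maps to (0, 0, D), which is a quick sanity check on the model.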

[0062] Step S120, extracting image pixel features in the depth image.

[0063] Extracting the image pixel features may include: depth gradient direction histogram features, local simplified ternary pattern features, depth value statistical distribution features, and depth difference features between other...
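As a concrete illustration of the first feature named above, a depth gradient direction histogram over a patch might be computed as below. The bin count, the gradient operator, and the magnitude weighting are all assumptions; the source does not specify them.

```python
import numpy as np

def depth_gradient_histogram(depth_patch, n_bins=8):
    """Sketch of a 'depth gradient direction histogram' feature.

    Gradient directions over the patch are binned into n_bins equal
    angular sectors, weighted by gradient magnitude, and normalized.
    The operator (np.gradient) and bin count are illustrative choices.
    """
    gy, gx = np.gradient(depth_patch.astype(float))   # per-pixel depth gradients
    mag = np.hypot(gx, gy)                            # gradient magnitude
    ang = np.mod(np.arctan2(gy, gx), 2 * np.pi)       # direction in [0, 2*pi)
    bins = np.minimum((ang / (2 * np.pi) * n_bins).astype(int), n_bins - 1)
    hist = np.bincount(bins.ravel(), weights=mag.ravel(), minlength=n_bins)
    s = hist.sum()
    return hist / s if s > 0 else hist                # normalize when non-empty
```

A flat patch yields an all-zero histogram, while a patch whose depth increases along one axis concentrates all weight in a single direction bin.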

Embodiment 2

[0103] Please refer to Figure 7. The face part identification device 10 provided by this embodiment includes:

[0104] A first acquisition module 110, configured to acquire a depth image;

[0105] A first feature extraction module 120, configured to extract image pixel features in the depth image;

[0106] A face deep learning module 130, configured to identify and classify the input image pixel features;

[0107] A judgment module 140, configured to judge whether the classification of the image pixel features matches an existing face part label in the face deep learning model;

[0108] An output module 150, configured to output the label corresponding to the image pixel features when the classification of the image pixel features matches an existing label in the face deep learning model.

[0109] In this embodiment, the face deep learning model takes the image pixel features as the input of its bottom input layer and performs regression classification at th...
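The wiring of modules 130-150 described above can be sketched as follows. The classifier is a deliberate stand-in: the patent's deep learning model and its regression classification are not reproduced here, and the label set is purely illustrative.

```python
# Illustrative face-part labels; the patent does not enumerate them.
FACE_PART_LABELS = {0: "eye", 1: "nose", 2: "mouth"}

def classify(feature_vector):
    """Stand-in for the face deep learning module (130): returns the
    index of the largest feature response instead of a trained model's
    regression classification."""
    return max(range(len(feature_vector)), key=feature_vector.__getitem__)

def identify_face_part(feature_vector, labels=FACE_PART_LABELS):
    """Judgment module (140) + output module (150): map a classification
    to an existing face-part label, or None when nothing matches."""
    cls = classify(feature_vector)
    return labels.get(cls)          # None models the no-match branch
```

Returning None when the class index has no label mirrors the judgment step: a label is output only when the classification matches an existing face part label.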



Abstract

The invention provides a face part identification method and device. The method comprises: obtaining a depth image; extracting image pixel features from the depth image; inputting the image pixel features into a face deep learning model for identification and classification; determining whether the classification of the image pixel features matches existing face part labels in the face deep learning model; and if so, outputting the labels corresponding to the image pixel features. The depth-image feature extraction method ensures extraction accuracy, and the deep learning model used to identify the image pixel features can identify and classify a plurality of face parts at once.

Description

Technical Field

[0001] The present invention relates to the technical field of face recognition, and in particular to a face part identification method and device.

Background

[0002] Face part recognition is an important part of face recognition analysis technology. It can be widely used in face detection and positioning, face recognition, gesture recognition, 3D face reconstruction, facial animation, face portrait generation, head tracking, and hands-free, mouse-free human-computer interaction for the disabled. Processing face images and performing face part recognition has become a research hotspot in this field.

[0003] Traditional face recognition solutions are mainly aimed at two-dimensional visible-light images. Such images are easily affected by factors such as image resolution, lighting, and shooting angle, resulting in poor recognition results and low recognition accuracy. Moreover, the traditional face recognition method only recog...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06K9/62
CPC: G06V40/172, G06F18/214
Inventors: 谭志国, 杨阿峰, 李洪
Owner HUNAN VISUALTOURING INFORMATION TECH CO LTD