
Human face shape classification method and system

A face shape classification technology, applied in the field of face shape classification methods and systems, which solves the problem of low accuracy in face shape classification and achieves the effects of overcoming poor robustness and improving classification accuracy and precision.

Active Publication Date: 2017-06-30
湖南峰华智能科技有限公司
View PDF · 4 Cites · 28 Cited by

AI Technical Summary

Problems solved by technology

[0004] Embodiments of the present invention provide a face shape classification method and system to solve the problem of the low accuracy of traditional face shape classification.

Method used



Examples


Embodiment 1

[0030] Figure 1 shows a flow chart of the steps of a face shape classification method according to Embodiment 1 of the present invention.

[0031] Referring to Figure 1, the face shape classification method of this embodiment comprises the following steps:

[0032] Step S100, acquiring 3D point cloud data and 2D image data of the target user's head.

[0033] In this step, the set of points sampled from the outer surface of a scanned object by a measuring instrument is called point cloud data; three-dimensional point cloud data is such a point set obtained by a three-dimensional image acquisition device such as a laser radar (LiDAR). In this embodiment, the scanned object is a human head, and the 3D point cloud data includes the three-dimensional XYZ coordinates of each point.
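As a minimal sketch of the data structures in step S100, the point cloud can be held as an N x 3 array of XYZ coordinates and the 2D image as an H x W x 3 RGB array. The function name, shapes, and synthetic values below are illustrative assumptions, not from the patent:

```python
import numpy as np

def acquire_head_data(num_points=5000, height=480, width=640, seed=0):
    """Stand-in for the measuring instrument: returns synthetic
    3D point cloud data (XYZ rows) and 2D image data (RGB pixels)."""
    rng = np.random.default_rng(seed)
    # Each row is one surface point of the scanned head: (x, y, z).
    point_cloud = rng.normal(loc=0.0, scale=0.1, size=(num_points, 3))
    # 2D image data: height x width x 3, 8-bit RGB.
    image = rng.integers(0, 256, size=(height, width, 3), dtype=np.uint8)
    return point_cloud, image

cloud, img = acquire_head_data()
print(cloud.shape, img.shape)  # (5000, 3) (480, 640, 3)
```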

[0034] In this embodiment, the 3D point cloud technology is applied to the field of face shape classification, and the model corresponding to the user's h...

Embodiment 2

[0049] Figure 3 shows a flow chart of the steps of a face shape classification method according to Embodiment 2 of the present invention.

[0050] Referring to Figure 3, the face shape classification method of this embodiment comprises the following steps:

[0051] Step S300, acquiring 3D point cloud data and 2D image data of the target user's head.

[0052] Specifically, the 3D point cloud data and 2D image data of the target user's head are acquired through an image acquisition device.

[0053] In this embodiment, 3D point cloud data of the target user's head is obtained by scanning from multiple viewpoints. The 3D point cloud data comprises multiple frames, each of which contains at least the point cloud data of the target user's head. A Hough forest model detection method is used to perform three-dimensional detection on the multi-frame point cloud data, and multiple initial head three-d...
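The multi-frame structure above can be sketched as a list of per-frame point clouds from which one initial head position is estimated per frame. The patent specifies a Hough forest detector for this step; the per-frame centroid below is only a placeholder for that detector, and all names are assumptions:

```python
import numpy as np

def head_centers_per_frame(frames):
    """For each frame's point cloud (an N_i x 3 array), return a rough
    initial head-position estimate. The patent uses a Hough forest
    detector here; the centroid is merely a stand-in for that step."""
    return np.array([frame.mean(axis=0) for frame in frames])

# Three toy frames whose points all sit at (k, k, k).
frames = [np.ones((10, 3)) * k for k in range(3)]
centers = head_centers_per_frame(frames)
print(centers.shape)  # (3, 3): one initial head position per frame
```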

Embodiment 3

[0100] Figure 6 shows a structural block diagram of a face shape classification system according to Embodiment 3 of the present invention.

[0101] The face shape classification system in this embodiment includes: a data acquisition module 600, configured to acquire 3D point cloud data and 2D image data of the target user's head; a model generation module 602, configured to generate a 3D model of the target user's head; a data mapping module 604, configured to map the 2D image data onto the 3D model according to the correspondence between the 3D point cloud data and the 2D image data; a region detection module 606, configured to detect the human face region according to the texture information and color information of the 2D image data; a feature point extraction module 608, configured to extract face feature points according to the mapped 3D model and the detected face region; and a face shape classification module 610, configured to classify the target user's fac...
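The module chain 600 through 610 can be outlined as one class whose methods stand in for the modules. Every method body below is a placeholder assumption (the patent does not disclose the algorithms at this level of detail); only the module order and data flow come from the text:

```python
import numpy as np

class FaceShapeClassifier:
    """Sketch of the Embodiment 3 pipeline; each method mirrors one
    module (600-610). All bodies are illustrative placeholders."""

    def acquire(self, rng):
        # data acquisition module 600: 3D point cloud + 2D image
        return rng.random((1000, 3)), rng.random((480, 640, 3))

    def build_model(self, cloud):
        # model generation module 602: here the cloud itself is the model
        return cloud

    def map_texture(self, model, image):
        # data mapping module 604: attach an RGB value to every 3D point
        h, w, _ = image.shape
        idx = np.arange(len(model))
        colors = image[idx % h, idx % w]  # placeholder correspondence
        return np.hstack([model, colors])

    def detect_face_region(self, image):
        # region detection module 606: texture/color cues; trivial mask here
        return np.ones(image.shape[:2], dtype=bool)

    def extract_features(self, textured_model, region):
        # feature point extraction module 608: first 68 XYZ points as a stub
        return textured_model[:68, :3]

    def classify(self, feature_points):
        # face shape classification module 610: toy width/height rule
        width = np.ptp(feature_points[:, 0])
        height = np.ptp(feature_points[:, 1])
        return "round" if width >= height else "oval"

clf = FaceShapeClassifier()
rng = np.random.default_rng(0)
cloud, image = clf.acquire(rng)
model = clf.build_model(cloud)
textured = clf.map_texture(model, image)
region = clf.detect_face_region(image)
feats = clf.extract_features(textured, region)
label = clf.classify(feats)
print(label)
```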



Abstract

The embodiments of the invention provide a human face shape classification method and system. The method comprises: acquiring three-dimensional point cloud data and two-dimensional image data of a target user's head; generating a three-dimensional model of the target user's head from the three-dimensional point cloud data; mapping the two-dimensional image data onto the three-dimensional model according to the correspondence between the three-dimensional point cloud data and the two-dimensional image data; detecting the human face region based on the texture information and color information of the two-dimensional image data; extracting face feature points according to the mapped three-dimensional model and the detected face region; and classifying the target user's face shape according to the extracted feature points. The embodiments improve the accuracy of face feature point extraction, overcome the poor robustness of extracting feature points from a two-dimensional image alone, and thereby improve the accuracy of face shape classification.
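One common way to realize the 2D-to-3D correspondence described above is to project each 3D point into the image with a pinhole camera model and read off the RGB value at the projected pixel. The patent does not state how the correspondence is established, so the projection model, intrinsics, and function names below are all assumptions:

```python
import numpy as np

def project_points(points, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    """Pinhole projection of 3D points (z > 0 assumed) into pixel
    coordinates; the intrinsics here are made-up example values."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    u = fx * x / z + cx
    v = fy * y / z + cy
    return np.stack([u, v], axis=1)

def color_point_cloud(points, image):
    """Attach the RGB value at each projected pixel to its 3D point;
    points projecting outside the image keep a zero color."""
    h, w, _ = image.shape
    uv = np.round(project_points(points)).astype(int)
    colors = np.zeros((len(points), 3), dtype=image.dtype)
    inside = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    colors[inside] = image[uv[inside, 1], uv[inside, 0]]
    return np.hstack([points, colors])

# A point on the optical axis maps to the principal point (cx, cy).
pts = np.array([[0.0, 0.0, 1.0], [10.0, 0.0, 1.0]])  # second point lands off-image
img = np.full((480, 640, 3), 255, dtype=np.uint8)
print(color_point_cloud(pts, img).shape)  # (2, 6): XYZ + RGB per point
```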

Description

Technical field

[0001] Embodiments of the present invention relate to the technical field of computer vision, and in particular to a method and system for classifying human face shapes.

Background technique

[0002] Face feature parameters refer to information such as the position and shape of each part of a face image and are an important component of face image analysis. They can be widely used in beauty and hairdressing, eyewear fitting, plastic surgery and other fields. Moreover, face feature parameters are also an important basis for face shape classification.

[0003] There are many face feature parameter extraction methods based on optical two-dimensional images, but the gray-level distribution of two-dimensional face images is complex and is affected by illumination during imaging, image size, distance, rotation and posture changes, which makes it very difficult to extract face feature parameters correctly, and even the extractio...

Claims


Application Information

IPC(8): G06K9/00
CPC: G06V40/172
Inventors: 滕书华, 谭志国, 鲁敏
Owner: 湖南峰华智能科技有限公司