Human face comparison method and device

A face comparison and face image technology, applied in the field of face comparison, which solves problems such as the large volume of comparison-result data, high manual-screening cost and heavy investigation workload, and achieves the effect of accurately locking similar objects.

Publication status: Inactive. Publication date: 2017-11-07
深圳市深网视界科技有限公司

Problems solved by technology

[0006] In order to overcome the deficiencies of the prior art, one purpose of the present invention is to provide a face comparison method that solves the following problems: the large volume of data in the comparison results of a large database imposes a heavy investigation workload on the user, and the cost of the corresponding further manual screening is very high; and face comparison based only on global features cannot exploit the individual features of personalized objects, and therefore cannot quickly and accurately lock onto similar objects.

Examples

Embodiment 1

[0060] As shown in Figure 1, a method for face comparison comprises the following steps:

[0061] Step S110, acquiring a face image, and dividing the face image into a first sub-region, a second sub-region and a third sub-region.

[0062] Specifically, the first sub-region, the second sub-region and the third sub-region are each one or more face regions such as the eye region, nose region, mouth region, ear region, cheek region or forehead region, and the three sub-regions are different from one another. The first, second and third sub-regions of the face image correspond respectively to the first, second and third sub-regions of the comparison image. For example, the first sub-region of each image is the eye region, the second sub-region is the nose region, and the third sub-region is a face region comprising the cheek region and the ear region; of course, the sub-regions may also in...
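
As a concrete illustration of the region division in step S110, the sketch below crops three sub-regions (eyes, nose, and a lower-face area covering cheek and ear) from a face image using facial landmark coordinates. This is a minimal sketch under assumptions of our own: the landmark names, the box proportions and the helper `divide_face_image` are hypothetical, since the patent does not prescribe how the sub-regions are located.

```python
import numpy as np

def divide_face_image(face_img: np.ndarray, landmarks: dict) -> dict:
    """Split a face image into three sub-regions from landmark points.

    `landmarks` is assumed to map names such as 'left_eye', 'right_eye'
    and 'nose_tip' to (x, y) pixel coordinates; any landmark detector
    (e.g. dlib's 68-point model or MediaPipe) could supply them.
    """
    h, w = face_img.shape[:2]

    def crop(x0, y0, x1, y1):
        # Clamp the box to the image bounds before slicing.
        x0, y0 = max(0, int(x0)), max(0, int(y0))
        x1, y1 = min(w, int(x1)), min(h, int(y1))
        return face_img[y0:y1, x0:x1]

    lx, ly = landmarks['left_eye']
    rx, ry = landmarks['right_eye']
    nx, ny = landmarks['nose_tip']
    eye_w = rx - lx  # inter-eye distance used to scale the boxes

    return {
        # First sub-region: a band around both eyes.
        'first': crop(lx - 0.4 * eye_w, ly - 0.3 * eye_w,
                      rx + 0.4 * eye_w, ry + 0.3 * eye_w),
        # Second sub-region: a box centred on the nose tip.
        'second': crop(nx - 0.4 * eye_w, ny - 0.5 * eye_w,
                       nx + 0.4 * eye_w, ny + 0.4 * eye_w),
        # Third sub-region: the lower half of the face (cheeks and ears).
        'third': crop(0, ny, w, h),
    }
```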

Embodiment 2

[0093] As shown in Figure 5, the method for face comparison includes the following steps:

[0094] Step S210, acquiring a face image, and dividing the face image into a first sub-region, a second sub-region and a third sub-region.

[0095] Step S220, extracting the global face feature of the face image, the first sub-feature of the first sub-region, the second sub-feature of the second sub-region, and the third sub-feature of the third sub-region.

[0096] Step S230, acquiring a first comparison image, and dividing the first comparison image into a first sub-region, a second sub-region and a third sub-region.

[0097] In this embodiment, acquiring the first comparison image specifically means acquiring the first comparison image from a comparison library, where the comparison library includes a plurality of comparison images. The comparison library is the aforementioned large database, which contains a large amount of face information, such as snapshots and ID photos of N individuals....
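
To make steps S210-S230 more concrete, the sketch below extracts a global feature and three sub-features for the acquired face image and for every comparison image in the comparison library, then ranks the library by a weighted comprehensive similarity. The `split` and `embed` callables, the cosine similarity metric and the example weights are assumptions chosen for illustration, not details fixed by the patent.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def rank_comparison_library(face_img, library, split, embed,
                            weights=(0.4, 0.2, 0.2, 0.2)):
    """Rank every comparison image in the library against the face image.

    `split(img)` is assumed to return the three sub-regions of steps
    S210/S230 as a dict, and `embed(img)` to return a feature vector;
    both are hypothetical stand-ins for whatever detector and feature
    extractor an implementation actually uses.
    """
    def features(img):
        regions = split(img)
        return [embed(img),                 # global feature
                embed(regions['first']),    # first sub-feature
                embed(regions['second']),   # second sub-feature
                embed(regions['third'])]    # third sub-feature

    probe = features(face_img)
    ranked = []
    for identity, comparison_img in library:      # e.g. (person id, ID photo)
        candidate = features(comparison_img)
        sims = [cosine_similarity(p, c) for p, c in zip(probe, candidate)]
        comprehensive = sum(w * s for w, s in zip(weights, sims))
        ranked.append((comprehensive, identity))
    ranked.sort(reverse=True)                     # most similar first
    return ranked
```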

Embodiment 3

[0109] As shown in Figure 6, the device for face comparison includes:

[0110] The first acquiring module 110 is configured to acquire a human face image, and divide the human face image into a first sub-region, a second sub-region and a third sub-region;

[0111] The first extraction module 120 is configured to extract the global face feature of the face image, the first sub-feature of the first sub-region, the second sub-feature of the second sub-region and the third sub-feature of the third sub-region;

[0112] The second acquisition module 130 is configured to acquire a first comparison image, and divide the first comparison image into a first sub-region, a second sub-region and a third sub-region;

[0113] The second extraction module 140 is configured to extract the global comparison feature of the first comparison image, the first sub-feature of the first sub-region, the second sub-feature of the second sub-region and the third sub-feature of the third sub-region ...
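
One possible way to organize the modules of Embodiment 3 in software is sketched below: a single device class whose methods mirror the first acquiring module 110, the first extraction module 120, the second acquisition module 130 and the second extraction module 140. The class name and the injected `splitter`/`embed` callables are hypothetical placeholders, not components specified by the patent.

```python
class FaceComparisonDevice:
    """Minimal sketch of the module layout in Embodiment 3."""

    def __init__(self, splitter, embed):
        self.splitter = splitter  # divides an image into three sub-regions
        self.embed = embed        # maps an image or region to a feature vector

    # First acquiring module 110 + first extraction module 120
    def extract_face_features(self, face_img):
        regions = self.splitter(face_img)
        return {
            'global': self.embed(face_img),
            'first': self.embed(regions['first']),
            'second': self.embed(regions['second']),
            'third': self.embed(regions['third']),
        }

    # Second acquisition module 130 + second extraction module 140
    def extract_comparison_features(self, comparison_img):
        # The comparison image is divided and embedded the same way.
        return self.extract_face_features(comparison_img)
```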

Abstract

The invention discloses a human face comparison method and device. The human face comparison method comprises the steps of acquiring a human face image; extracting a global face feature, a first componential feature, a second componential feature and a third componential feature; acquiring a comparison image; extracting a global comparison feature, a first sub-feature, a second sub-feature and a third sub-feature; comparing the global comparison feature and the sub-features of the comparison image with the global face feature and the componential features of the human face image, respectively, to obtain the corresponding global similarity, first similarity, second similarity and third similarity; and performing weighted summation on these similarities to obtain a comprehensive similarity. According to the invention, local features and global features of the human face are integrated and applied to human face comparison, so the comprehensive similarity better reflects the similarity between the comparison image and the human face image. In addition, the similarity or dissimilarity of a personalized region is magnified by increasing the corresponding weight, so that a similar object of the human face image can be locked quickly and accurately.
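
Written out, the comprehensive similarity described in the abstract is a weighted sum of the global similarity and the three region similarities, for example (the normalization of the weights to sum to one is a common convention, not something the abstract requires):

```latex
S_{\mathrm{comp}} = w_g S_{\mathrm{global}} + w_1 S_1 + w_2 S_2 + w_3 S_3,
\qquad w_g + w_1 + w_2 + w_3 = 1 .
```

Increasing, say, the weight of an eye region that carries a distinctive personalized feature magnifies that region's contribution to the comprehensive similarity, which is how a similar object can be locked more quickly and accurately.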

Description

Technical field

[0001] The invention relates to face comparison technology, and in particular to a method and device for face comparison.

Background technique

[0002] Face comparison refers to extracting face features Ft(A) and Ft(B) from two given face pictures A and B, and reflecting the similarity of the faces by computing a feature similarity S; the larger S is, the more similar face pictures A and B are. A threshold t is set, and if S > t, A and B are taken to be the same person.

[0003] Usually a snapshot of A is provided and compared against images of N individuals in a database to determine A's identity. N is very large, ranging from millions to tens of millions, so this is also called large-database comparison. In existing large-database comparison, N is often above the 1,000,000 (100W) record level. Even if the false-positive rate is 1/1000, there will be 1,000,000 × 0.1% = 1,000 false positives, and the a...
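
For concreteness, the decision rule and the false-positive estimate in paragraphs [0002] and [0003] can be written out as follows (100W is read as 1,000,000; the notation otherwise follows the paragraphs above):

```latex
S = \mathrm{sim}\bigl(\mathrm{Ft}(A),\, \mathrm{Ft}(B)\bigr), \qquad
\text{decide that } A \text{ and } B \text{ are the same person iff } S > t ;
\qquad
\text{expected false positives} \approx N \times \text{false-positive rate}
= 1{,}000{,}000 \times \tfrac{1}{1000} = 1000 .
```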

Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00
CPC: G06V40/171, G06V40/172, G06V10/95
Inventors: 马东宇, 龚丽君, 赵瑞
Owner: 深圳市深网视界科技有限公司