
Local face matching method and system, storage medium and electronic equipment

A partial face matching method and related technology, applied in neural learning methods, computer components, instruments, etc., which can solve problems such as the limited application scenarios of existing methods

Pending Publication Date: 2021-11-16
SHANGHAI MININGLAMP ARTIFICIAL INTELLIGENCE GRP CO LTD

AI Technical Summary

Problems solved by technology

[0008] The embodiments of the present application provide a partial face matching method, system, storage medium, and electronic equipment, so as to at least solve the problem that existing partial face matching methods are limited in their application scenarios.



Examples


Embodiment 1

[0056] Please refer to Figure 1, which is a flow chart of the partial face matching method. As shown in Figure 1, the partial face matching method of the present invention comprises:

[0057] Preprocessing step S1: preprocess the pre-collected, unoccluded first face picture to obtain a face sample, extract features from the face sample with a neural network to obtain the first face feature, and store it in a database;

[0058] Face feature acquisition step S2: collect a second face picture in real time, and obtain the second face feature after preprocessing the second face picture;

[0059] Matching step S3: compare the second face feature with each stored first face feature in turn, and output the first face feature at the shortest distance as the matching result for the second face picture.
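Steps S1 through S3 can be sketched as a minimal pipeline. The patent does not specify the feature extractor, so `extract_feature` below is a hypothetical stand-in (flatten and L2-normalise) for the neural network; `match` implements the shortest-Euclidean-distance rule of step S3.

```python
import numpy as np

def extract_feature(image):
    """Hypothetical stand-in for the patent's neural-network feature
    extractor: flatten the image and L2-normalise it. A real system
    would run a face-embedding network here instead."""
    v = np.asarray(image, dtype=np.float64).ravel()
    return v / (np.linalg.norm(v) + 1e-12)

def match(second_feature, first_features):
    """Step S3: compare the second face feature against every stored
    first face feature and return the identity whose feature lies at
    the shortest Euclidean distance."""
    return min(first_features,
               key=lambda name: np.linalg.norm(first_features[name] - second_feature))

# Step S1: build the database of first face features from toy "pictures".
rng = np.random.default_rng(0)
pictures = {name: rng.random((8, 8)) for name in ("alice", "bob")}
database = {name: extract_feature(img) for name, img in pictures.items()}

# Step S2: a slightly noisy re-capture of alice plays the second face picture.
query = extract_feature(pictures["alice"] + 0.01 * rng.random((8, 8)))
print(match(query, database))  # alice
```

The identity names and array shapes are illustrative only; the point is that enrollment (S1), live capture (S2), and nearest-feature matching (S3) are independent stages sharing one feature space.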

[0060] Please refer to Figure 2, which is a flowchart of preprocessing step S1. As shown in Figure 2, the ...

Embodiment 2

[0085] Please refer to Figure 5, which is a structural schematic diagram of the partial face matching system of the present invention. As shown in Figure 5, the partial face matching system comprises:

[0086] A preprocessing module, which preprocesses the pre-collected, unoccluded first face picture to obtain a face sample, extracts features from the face sample with a neural network to obtain the first face feature, and stores it in the database;

[0087] A face feature acquisition module, which collects the second face picture in real time and obtains the second face feature after preprocessing the second face picture;

[0088] A matching module, which compares the second face feature with each stored first face feature in turn, and outputs the first face feature at the shortest distance as the matching re...
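The three modules of Embodiment 2 map naturally onto a small class. This is a sketch of that layout only, not the patent's implementation; `extract_fn` is a hypothetical injected callable standing in for the neural-network extractor, so the module boundaries stay visible.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict
import numpy as np

@dataclass
class PartialFaceMatcher:
    """Sketch of the three-module system of Embodiment 2.

    extract_fn is a hypothetical stand-in for the patent's
    neural-network feature extractor."""
    extract_fn: Callable[[np.ndarray], np.ndarray]
    database: Dict[str, np.ndarray] = field(default_factory=dict)

    def enroll(self, identity: str, first_picture: np.ndarray) -> None:
        # Preprocessing module: derive the first face feature and store it.
        self.database[identity] = self.extract_fn(first_picture)

    def acquire(self, second_picture: np.ndarray) -> np.ndarray:
        # Face feature acquisition module: feature of the live capture.
        return self.extract_fn(second_picture)

    def match(self, second_picture: np.ndarray) -> str:
        # Matching module: the shortest-distance first face feature wins.
        feat = self.acquire(second_picture)
        return min(self.database,
                   key=lambda k: np.linalg.norm(self.database[k] - feat))
```

Keeping the extractor injectable mirrors the patent's separation of preprocessing, acquisition, and matching: the same matcher works with any feature network that emits fixed-length vectors.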

Embodiment 3

[0100] As shown in Figure 6, this embodiment discloses a specific implementation of an electronic device. The electronic device may include a processor 81 and a memory 82 storing computer program instructions.

[0101] Specifically, the processor 81 may include a central processing unit (CPU) or an application-specific integrated circuit (ASIC), or may be configured as one or more integrated circuits implementing the embodiments of the present application.

[0102] The memory 82 may include mass storage for data or instructions. By way of example and not limitation, the memory 82 may include a hard disk drive (HDD), a floppy disk drive, a solid-state drive (SSD), flash memory, an optical disk, a magneto-optical disk, magnetic tape, a universal serial bus (USB) drive, or a combination of two or more of the above. The memory 82 may comprise removable or non...



Abstract

The invention discloses a local face matching method and system, a storage medium, and electronic equipment. The matching method comprises the following steps: a preprocessing step: preprocess a pre-collected, unshielded first face picture to obtain a face sample, extract features from the face sample with a neural network to obtain the first face features, and store them in a database; a face feature acquisition step: collect a second face picture in real time and preprocess it to obtain the second face features; and a matching step: compare the second face features with the first face features in sequence, and output the detected first face feature at the shortest distance as the matching result for the second face picture. The occluded region of the face is cut and re-spliced before being input into the network structure, so that application in different scenes is realized.

Description

technical field [0001] The invention belongs to the field of partial face matching, and in particular relates to a partial face matching method, system, storage medium, and electronic equipment. Background technique [0002] Deep learning face feature extraction generally proceeds through face detection, face key point detection, face correction, face feature extraction, and face feature comparison. Specifically, the face recognition process first detects the location of the face in the picture; after cropping the face region, it obtains the locations of key points such as the facial features, computes the face pose from the mathematical relationships between the key points, and corrects the picture. The cropped and rectified frontal face image is then input into the face feature comparison network to obtain the feature vector of the face. Finally, the feature vectors of the faces are compared to determine whether th...

Claims


Application Information

IPC(8): G06K9/62, G06K9/00, G06N3/08
CPC: G06N3/08, G06F18/22
Inventors: 苏安炀, 唐大闰
Owner: SHANGHAI MININGLAMP ARTIFICIAL INTELLIGENCE GRP CO LTD