
Face Gender Recognition Method Based on Fireworks Deep Belief Network

A deep belief network and gender recognition technology, applied in the field of face gender recognition, in which the fireworks algorithm optimizes the initial parameter space of the deep belief network, achieving global optimization, strong anti-interference and a high recognition rate.

Active Publication Date: 2019-06-28
SHAANXI NORMAL UNIV

AI Technical Summary

Problems solved by technology

However, the error backpropagation method easily falls into local optima.


Examples


Embodiment 1

[0044] Taking the internationally accepted Extended Cohn-Kanade face database as the input images and MATLAB 2010b as the experimental platform, face gender recognition is performed as an example. As shown in Figure 1, the method is as follows:

[0045] 1. Raw image preprocessing

[0046] The Extended Cohn-Kanade face database provides 210 training images and 140 test images; some example images are shown in Figure 2. The original color images in Figure 2 are converted into grayscale images, the face region is segmented, and each face image is resampled to 24×24 pixels using bicubic interpolation, as shown in Figure 3. Each segmented image is then converted into a one-dimensional vector, with each row vector representing one image.
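
The preprocessing step above can be summarized in a short sketch. The original experiments use MATLAB 2010b; the following is a minimal Python equivalent (an assumption, not the patent's code) using Pillow and NumPy, where the face_box argument is a hypothetical placeholder for the face segmentation step.

# Minimal Python sketch of the preprocessing described above (the patent uses
# MATLAB 2010b; Pillow/NumPy and the face_box argument are assumptions).
import numpy as np
from PIL import Image

def preprocess(path, face_box=None, size=(24, 24)):
    """Color image -> segmented 24x24 grayscale image -> one row vector."""
    img = Image.open(path).convert("L")          # convert color image to grayscale
    if face_box is not None:                     # (left, upper, right, lower)
        img = img.crop(face_box)                 # segment the face region
    img = img.resize(size, Image.BICUBIC)        # bicubic resampling to 24x24 pixels
    return np.asarray(img, dtype=np.float64).reshape(1, -1) / 255.0

# Each row of the resulting matrix represents one image:
# X_train = np.vstack([preprocess(p) for p in training_image_paths])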

[0047] 2. Training Deep Belief Network

[0048] The deep belief network is configured with 1 input layer, 3 hidden layers and 1 output layer, where the number of node...
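
The unsupervised training of a deep belief network stacks restricted Boltzmann machines trained layer by layer. The sketch below shows one RBM updated with CD-1 contrastive divergence; the hidden-layer sizes are illustrative assumptions only, since the text above truncates before giving the node counts.

# One restricted Boltzmann machine trained with CD-1, the building block of the
# layer-wise unsupervised pretraining of the deep belief network.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    def __init__(self, n_visible, n_hidden, lr=0.1, seed=0):
        self.rng = np.random.default_rng(seed)
        self.W = self.rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)            # visible-unit biases
        self.b_h = np.zeros(n_hidden)             # hidden-unit biases
        self.lr = lr

    def cd1_step(self, v0):
        """One contrastive-divergence update on a batch v0 (rows = images)."""
        h0 = sigmoid(v0 @ self.W + self.b_h)
        h0_s = (self.rng.random(h0.shape) < h0).astype(float)
        v1 = sigmoid(h0_s @ self.W.T + self.b_v)  # reconstruction of the batch
        h1 = sigmoid(v1 @ self.W + self.b_h)
        n = v0.shape[0]
        self.W   += self.lr * (v0.T @ h0 - v1.T @ h1) / n
        self.b_v += self.lr * (v0 - v1).mean(axis=0)
        self.b_h += self.lr * (h0 - h1).mean(axis=0)

    def transform(self, v):
        return sigmoid(v @ self.W + self.b_h)     # activations fed to the next RBM

# Stack: 576 (= 24x24) visible units, then 3 hidden layers (sizes assumed).
# layer_sizes = [576, 200, 200, 200]

Each trained RBM's hidden activations become the input of the next RBM; the resulting weights form the initial parameter space that the fireworks algorithm later refines.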

Embodiment 2

[0073] Taking the internationally accepted MORPH face database as the input images and MATLAB 2010b as the experimental platform, face gender recognition is performed as an example. As shown in Figure 1, the method is as follows:

[0074] 1. Raw image preprocessing

[0075] The MORPH face database provides 1400 training images and 1000 test images; some example images are shown in Figure 4. The original color images in Figure 4 are converted into grayscale images, the face region is segmented, and each face image is resampled to 24×24 pixels using bicubic interpolation, as shown in Figure 5. Each segmented image is then converted into a one-dimensional vector, with each row vector representing one image.

[0076] 2. Training Deep Belief Network

[0077] The deep belief network is configured with 1 input layer, 3 hidden layers and 1 output layer, where the number of nodes in the input layer is...

Embodiment 3

[0102] Taking the internationally accepted LFW face database as the input images and MATLAB 2010b as the experimental platform, face gender recognition is performed as an example. As shown in Figure 1, the method is as follows:

[0103] 1. Raw image preprocessing

[0104] The LFW face database provides 400 training images and 200 test images; some example images are shown in Figure 6. The original color images in Figure 6 are converted into grayscale images, the face region is segmented, and each face image is resampled to 24×24 pixels using bicubic interpolation, as shown in Figure 7. Each segmented image is then converted into a one-dimensional vector, with each row vector representing one image.

[0105] 2. Training Deep Belief Network

[0106] The deep belief network is configured with 1 input layer, 3 hidden layers and 1 output layer, where the number of nodes in the input layer is 24×24, and t...



Abstract

A face gender recognition method based on a Fireworks Deep Belief Network consists of original image preprocessing, training a deep belief network, optimizing the initial parameter space of the deep belief network with the fireworks algorithm, and using the fireworks-optimized deep belief network for face gender recognition. The invention uses the deep belief network to learn richer features of face image semantic information in the unsupervised stage, and uses the fireworks algorithm to adjust the initial parameter space of the deep belief network in the supervised stage, yielding a network model better suited to the recognition task. The invention offers strong anti-interference, global optimization and a high recognition rate, and can be used for face gender recognition as well as other image recognition and classification tasks.
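
As a rough illustration of how a fireworks algorithm can search the initial parameter space, the sketch below implements a simplified standard fireworks algorithm over a flattened parameter vector. The operators, constants and the fitness function (for example, the DBN's validation error after decoding a candidate vector into initial weights) are assumptions and may differ from the patented variant.

# Simplified fireworks algorithm (FWA) sketch: minimize `fitness` over a
# dim-dimensional vector. Constants and the selection rule are assumptions.
import numpy as np

def fireworks_optimize(fitness, dim, n_fireworks=5, total_sparks=30,
                       amp_max=1.0, bounds=(-1.0, 1.0), iters=50, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, size=(n_fireworks, dim))     # initial firework locations
    eps = 1e-12
    for _ in range(iters):
        f = np.array([fitness(x) for x in X])
        worst, best = f.max(), f.min()
        # Better (lower-fitness) fireworks get more sparks ...
        share = (worst - f + eps) / (worst - f + eps).sum()
        n_sparks = np.maximum(1, np.round(total_sparks * share)).astype(int)
        # ... and a smaller explosion amplitude.
        amps = amp_max * (f - best + eps) / (f - best + eps).sum()
        pool = [X]
        for x, s, a in zip(X, n_sparks, amps):
            mask = rng.random((s, dim)) < 0.5             # perturb about half the dimensions
            cand = x + mask * rng.uniform(-a, a, size=(s, dim))
            pool.append(np.clip(cand, lo, hi))
        pool = np.vstack(pool)
        pf = np.array([fitness(x) for x in pool])
        # Selection: keep the best candidate, fill the rest at random (simplified).
        survivors = [pool[pf.argmin()]]
        idx = rng.choice(len(pool), size=n_fireworks - 1, replace=False)
        survivors.extend(pool[i] for i in idx)
        X = np.array(survivors)
    f = np.array([fitness(x) for x in X])
    return X[f.argmin()], f.min()

The best location found after the final iteration would then serve as the deep belief network's initial parameters before supervised fine-tuning.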

Description

Technical Field

[0001] The invention belongs to the technical field of face image gender recognition, and in particular relates to a face gender recognition method in which the fireworks algorithm is used to optimize the initial parameter space of a deep belief network.

Background Technique

[0002] Face gender recognition technology has a wide range of applications in human-computer interaction, machine vision, identity verification, security systems and other areas, and has become a research hotspot in recent years. A gender recognition system for face images usually consists of three parts: face detection, feature extraction and recognition. Low-level visual features are extracted from the face image and used as the input of a classifier to identify the gender of the face. Facial feature extraction is the key to gender recognition, and the quality of the selected features directly affects the subsequent recognition accuracy. The commonly used artific...


Application Information

Patent Type & Authority Patents(China)
IPC IPC(8): G06K9/00G06K9/62
CPCG06V40/16G06F18/214G06F18/24
Inventor 郭敏王健马苗陈昱莅肖冰
Owner SHAANXI NORMAL UNIV