Image sentiment classification method based on class activation mapping and visual saliency

An emotion classification and image technology, applied to neural learning methods, character and pattern recognition, instruments, etc., which can solve problems such as the underutilization of important local image regions and the limited performance of emotion classification

Active Publication Date: 2020-10-27
GUILIN UNIV OF ELECTRONIC TECH


Problems solved by technology

[0007] Aiming at the problem that existing image emotion classification methods consider only the overall information of an image and do not make full use of its important local regions, which limits classification performance, the invention provides an image emotion classification method based on class activation mapping and visual saliency.



Examples


Detailed Description of the Embodiments

[0060] As shown in Figure 1, an image sentiment classification method based on class activation mapping and visual saliency includes the following steps:

[0061] S1: Prepare an emotional image dataset for training the model, expand the dataset, and resize the image samples in the dataset to 448×448×3;
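
A minimal sketch of step S1, assuming a PyTorch/torchvision pipeline; the specific augmentations (random crop, horizontal flip) and the ImageFolder directory layout are illustrative assumptions, since the patent only states that the dataset is expanded and the samples resized to 448×448×3.

```python
import torchvision.transforms as T
from torchvision.datasets import ImageFolder
from torch.utils.data import DataLoader

# Resize slightly larger, then crop and flip to expand the dataset (assumed
# augmentations); the final sample size matches the patent's 448x448x3.
train_transform = T.Compose([
    T.Resize((480, 480)),
    T.RandomCrop((448, 448)),
    T.RandomHorizontalFlip(),
    T.ToTensor(),
])

# "emotion_images/train" is a hypothetical directory with one folder per emotion class.
train_set = ImageFolder("emotion_images/train", transform=train_transform)
train_loader = DataLoader(train_set, batch_size=16, shuffle=True, num_workers=4)
```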

[0062] S2: Extract the overall feature F of each image through the overall feature extraction network of the model;
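
A sketch of step S2, assuming a pretrained ResNet-50 as the overall feature extraction network (the backbone is not named in this excerpt). Dropping the average-pooling and fully connected layers leaves a spatial feature map that serves as the overall feature F.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet50

# Keep the convolutional stages only; discard avgpool and fc so the output is
# a spatial feature map rather than a classification vector.
backbone = resnet50(weights="IMAGENET1K_V1")         # ImageNet weights (assumed)
overall_net = nn.Sequential(*list(backbone.children())[:-2])

x = torch.randn(1, 3, 448, 448)                      # one 448x448x3 image sample
F_overall = overall_net(x)                           # overall feature F, shape (1, 2048, 14, 14)
```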

[0063] S3: Generate an image saliency map and extract the salient region feature F_S through the salient region feature extraction network of the model;
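
A sketch of step S3, assuming a pretrained saliency detector is available (the patent's multi-scale fully convolutional network; here only a stand-in module) and that the saliency map is used to mask the input before feature extraction. This masking is one plausible reading of "salient region feature F_S", not necessarily the patented formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F_nn
from torchvision.models import resnet50

def extract_salient_feature(image, saliency_net, region_net):
    """image: (B, 3, 448, 448); returns the salient region feature F_S."""
    with torch.no_grad():
        sal = torch.sigmoid(saliency_net(image))                  # saliency map in [0, 1]
    sal = F_nn.interpolate(sal, size=image.shape[-2:],
                           mode="bilinear", align_corners=False)  # match the image resolution
    masked = image * sal                                          # keep salient pixels, suppress the rest
    return region_net(masked)                                     # F_S, e.g. (B, 2048, 14, 14)

# Stand-ins for the two networks (assumptions, not the patent's exact models).
saliency_net = nn.Conv2d(3, 1, kernel_size=3, padding=1)          # placeholder saliency detector
region_net = nn.Sequential(*list(resnet50(weights=None).children())[:-2])

F_S = extract_salient_feature(torch.randn(2, 3, 448, 448), saliency_net, region_net)
```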

[0064] S4: Generate the image emotion distribution map and extract the emotional region feature F_M through the emotional region feature extraction network of the model;
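
A sketch of step S4 based on class activation mapping (Zhou et al., 2016), which needs only image-level emotion labels: per-class activation maps are computed from the last convolutional features and the classifier weights. Combining the maps with the predicted class probabilities to form the emotion distribution map, and re-weighting the features with it, are assumptions about details the excerpt does not spell out.

```python
import torch
import torch.nn as nn

def emotion_region_feature(feat, fc):
    """feat: (B, C, H, W) conv features; fc: nn.Linear(C, num_emotions)."""
    logits = fc(feat.mean(dim=(2, 3)))                           # GAP + linear -> image-level prediction
    probs = torch.softmax(logits, dim=1)                         # (B, num_emotions)
    cams = torch.einsum("kc,bchw->bkhw", fc.weight, feat)        # per-class activation maps
    dist_map = torch.relu(torch.einsum("bk,bkhw->bhw", probs, cams))        # emotion distribution map
    dist_map = dist_map / (dist_map.amax(dim=(1, 2), keepdim=True) + 1e-6)  # normalize to [0, 1]
    return feat * dist_map.unsqueeze(1)                          # F_M: features re-weighted by the map

fc = nn.Linear(2048, 8)                                          # 8 emotion categories (assumed)
F_M = emotion_region_feature(torch.randn(2, 2048, 14, 14), fc)
```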

[0065] S5: Fuse the overall feature F with the local features F_S and F_M to obtain discriminative features, and generate the semantic vector d through a global average pooling operation;
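
A sketch of step S5; channel-wise concatenation of the three feature maps is an assumption, since the excerpt only says the features are fused before global average pooling produces the semantic vector d.

```python
import torch

def fuse_and_pool(F_overall, F_S, F_M):
    """All inputs: (B, C, H, W) feature maps with matching spatial size."""
    fused = torch.cat([F_overall, F_S, F_M], dim=1)   # discriminative fused features
    return fused.mean(dim=(2, 3))                     # global average pooling -> d, shape (B, 3C)

d = fuse_and_pool(torch.randn(2, 2048, 14, 14),
                  torch.randn(2, 2048, 14, 14),
                  torch.randn(2, 2048, 14, 14))       # d: (2, 6144)
```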

[0066] S6: Input the semantic vector d into the classifier to obtain the emotion classification result of the image.
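
A sketch of step S6, assuming the semantic vector d is fed to a fully connected softmax classifier trained with cross-entropy; the step is truncated in this excerpt, so the exact classifier and loss are assumptions.

```python
import torch
import torch.nn as nn

num_emotions = 8                                    # assumed number of emotion classes
classifier = nn.Linear(6144, num_emotions)          # 6144 = 3 x 2048 fused channels from S5

d = torch.randn(4, 6144)                            # semantic vectors for a batch of 4 images
labels = torch.randint(0, num_emotions, (4,))       # image-level emotion labels

logits = classifier(d)
loss = nn.functional.cross_entropy(logits, labels)  # training objective
pred = logits.argmax(dim=1)                         # predicted emotion class at inference time
```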



Abstract

The invention provides an image emotion classification method based on class activation mapping and visual saliency, and relates to the technical field of computer vision and image processing. The method comprises the following steps: first, the overall features of an image are extracted through a deep convolutional neural network; saliency detection is then carried out on the image with a multi-scale fully convolutional neural network to obtain its saliency region features, and at the same time an emotion distribution map of the image is generated through class activation mapping and emotion region features are extracted, using only image-level emotion labels. The saliency region features and the emotion region features are regarded as local representations of the image and are fused with its overall features to obtain more discriminative visual features, which are used for visual emotion classification. The method considers the overall information of the image while making full use of the information in its important local regions, and requires only image-level emotion labels, so the labeling burden is greatly reduced.

Description

Technical Field

[0001] The invention belongs to the technical field of computer vision and image processing, and in particular relates to an image emotion classification method based on class activation mapping and visual saliency.

Background Art

[0002] As a platform for users to create and share information, social media has become an important part of people's lives. Every day, more and more people publish massive multimedia content through social media to express their opinions and emotions. Sentiment analysis of these user-generated data can effectively reveal user behavior and psychology and discover user needs, which has important application value. With the increasing amount of visual content posted by users on social media, image sentiment classification has attracted extensive attention.

[0003] Different from object recognition tasks, image emotion involves high-level abstraction and cognitive subjectivity, so image emotion recognition is a more challenging task.


Application Information

IPC(8): G06K9/46, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/08, G06V10/462, G06N3/047, G06N3/048, G06N3/045, G06F18/241, G06F18/2415
Inventors: 蔡国永, 储阳阳
Owner: GUILIN UNIV OF ELECTRONIC TECH