
Fusion feature-based cheongsam image emotional semantic recognition method

A technology that integrates feature fusion and semantic recognition, applied in image enhancement, image analysis, image data processing and similar fields.

Active Publication Date: 2017-09-15
杭州浙泉网络科技有限公司

AI Technical Summary

Problems solved by technology

At present, most of the research on clothing images focuses on object recognition, and there are relatively few studies on clothing images from the perspective of emotional semantics.



Examples


Embodiment

[0125] Step 1. Adjective selection and establishment of emotional space:

[0126] 1.1 From 30 adjectives commonly used to describe clothing pictures, select a representative group of adjectives (simple and steady, noble and elegant, gentle and romantic, plain and fresh) to describe the cheongsam;

[0127] 1.2 Collect 400 cheongsam pictures, all with a solid-color background;

[0128] 1.3 Randomly divide the pictures from step 1.2 into 5 groups of 80 pictures each and recruit 20 subjects, with every 4 subjects evaluating the style of one group of pictures using the adjectives from step 1.1. The four styles are coded 1, 2, 3 and 4, where 1 represents "simple and steady" clothing, 2 represents "noble and elegant" clothing, 3 represents "gentle and romantic" clothing, and 4 represents "plain and fresh" clothing. Keep the cheongsam pictures whose evaluations are consistent and remove the pictures with ambiguous styles; finally, 350 pictures and their corresponding emotional labels are retained (a minimal sketch of this consensus filtering is given below).
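The following is a minimal, hypothetical sketch of the consensus filtering in step 1.3: each picture receives four style codes (1 to 4) from four subjects, pictures with consistent evaluations are kept with their majority label, and ambiguous ones are discarded. The function names and the agreement threshold are illustrative assumptions, not taken from the patent.

# Hypothetical sketch of the consensus filtering in step 1.3.
from collections import Counter

STYLES = {1: "simple and steady", 2: "noble and elegant",
          3: "gentle and romantic", 4: "plain and fresh"}

def filter_consistent(ratings, min_agree=3):
    """ratings: dict mapping picture id -> list of 4 style codes (1-4).
    Returns picture id -> majority style for pictures where at least
    `min_agree` of the 4 subjects chose the same style (assumed threshold)."""
    kept = {}
    for pic_id, votes in ratings.items():
        style, count = Counter(votes).most_common(1)[0]
        if count >= min_agree:      # consistent evaluation: keep with majority label
            kept[pic_id] = style
    return kept                     # ambiguous pictures are dropped

# Example: "q001" is kept (3 of 4 subjects agree), "q002" is ambiguous.
labels = filter_consistent({"q001": [2, 2, 2, 4], "q002": [1, 2, 3, 4]})
print({pid: STYLES[s] for pid, s in labels.items()})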


Abstract

The invention discloses a fusion feature-based cheongsam image emotional semantic recognition method. The method comprises the following steps: selecting representative adjectives suited to cheongsam images; extracting color and texture features of a cheongsam image; carrying out feature fusion; carrying out image emotional semantic learning; and performing emotional recognition on the image with the trained emotional model. In the method, the color and texture features are fused, a machine learning method is used to establish a mapping from the low-level features of the cheongsam image to the high-level emotional semantics, a cheongsam image emotional semantic classification model is constructed, and emotional recognition of cheongsam images is thereby realized. The method is easy to implement and achieves relatively high emotional recognition accuracy.
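As a reading aid, here is a minimal sketch of the pipeline summarized above, under stated assumptions: an HSV histogram as the color feature, gray-level co-occurrence matrix (GLCM) statistics as the texture feature, fusion by simple concatenation, and an SVM as the learner. The patent's CPC codes point to SVM-style classification, but the specific descriptors, parameters and function names here are illustrative assumptions, not the patented implementation.

# Hypothetical sketch of the fused-feature emotion classification pipeline.
import numpy as np
from skimage.color import rgb2hsv, rgb2gray
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC

def color_feature(rgb, bins=8):
    """Normalized HSV histogram as the color descriptor (assumed choice)."""
    hsv = rgb2hsv(rgb)
    hist = [np.histogram(hsv[..., c], bins=bins, range=(0, 1), density=True)[0]
            for c in range(3)]
    return np.concatenate(hist)

def texture_feature(rgb):
    """GLCM contrast/energy/homogeneity/correlation as the texture descriptor."""
    gray = (rgb2gray(rgb) * 255).astype(np.uint8)
    glcm = graycomatrix(gray, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "energy", "homogeneity", "correlation"]
    return np.concatenate([graycoprops(glcm, p).ravel() for p in props])

def fused_feature(rgb):
    """Feature fusion by concatenating the color and texture vectors."""
    return np.concatenate([color_feature(rgb), texture_feature(rgb)])

def train_emotion_model(images, labels):
    """images: list of RGB arrays; labels: style codes 1-4 from step 1.3."""
    X = np.stack([fused_feature(img) for img in images])
    model = SVC(kernel="rbf")   # assumed classifier for the low-level -> semantic mapping
    model.fit(X, labels)
    return model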

Description

Technical field

[0001] The invention relates to an emotional semantic recognition method for cheongsam images based on fused features, and in particular to a cheongsam image emotional semantic recognition method that fuses color and texture features.

Background technique

[0002] People's perception and understanding of an image occur at the semantic level and are subjective. Many studies have shown that different images can evoke different emotions in humans, yet current image processing and applications mostly ignore the influence and role of emotion. For clothing images, any single garment can give people a certain emotional feeling: some clothing makes people feel noble, while other clothing makes people feel fresh. At present, most research on clothing images focuses on object recognition, and there are relatively few studies that approach clothing images from the perspective of emotional semantics; studies on the emotional semantics of cheongsam images in particular are very rare.


Application Information

IPC (8): G06K9/62; G06K9/46; G06K9/40; G06K9/34; G06T5/30
CPC: G06T5/30; G06T2207/20032; G06V10/30; G06V10/267; G06V10/56; G06F18/217; G06F18/2411; G06F18/253
Inventor: 秦梓轩, 胡更生, 楼苏迪, 陈梅
Owner: 杭州浙泉网络科技有限公司