Method and system for controlling emotion lamp based on expressions

An emotion and expression technology applied in the field of artificial intelligence. It addresses the problems that existing schemes rely on a single reference object and are not applicable to all user groups, and achieves the effects of fewer training parameters, excellent classification performance, and a reduced amount of calculation.

Pending Publication Date: 2020-05-22
武汉美和易思数字科技有限公司

AI Technical Summary

Problems solved by technology

Existing emotion-lamp adjustment schemes rely on a single reference object and are not suitable for all user groups. To solve this problem, the present invention provides a method and system for controlling an emotion lamp based on expressions, which can adjust the brightness and tone of the emotion lamp according to different ages and actions.



Examples


Embodiment 1

[0041] As shown in figure 1, the method for controlling an emotion lamp based on expressions of the present invention includes the following steps:

[0042] S1. Collect the user's face image in real time, preprocess the collected face image, and obtain facial five-sense-organ features, facial contour features, and histogram of oriented gradients (HOG) features;

[0043] Further preferably, the preprocessing includes grayscale processing, image cropping, and data augmentation.

[0044] The grayscale processing specifically means converting the input image into a single-channel grayscale image, without applying techniques such as intensity normalization.

[0045] The image cropping specifically uses OpenCV's Haar feature cascade classifier to detect and crop human faces. The Haar feature cascade classifier returns the position of each face in the picture; once the face position information is obtained, every face image in the picture can be extracted, and the extracted...
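The cropping and feature-extraction steps above can be sketched in a few lines of Python. This is a minimal illustration rather than the patented implementation: the cascade file bundled with OpenCV, the 48x48 crop size, and the HOG parameters are assumptions, and scikit-image is used for the directional gradient histogram features because the patent does not name a library.

```python
import cv2
from skimage.feature import hog

# Haar feature cascade classifier shipped with OpenCV (assumed model file)
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def preprocess(frame):
    """Return one (face_crop, hog_features) pair per face detected in the frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)            # single-channel grayscale
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    results = []
    for (x, y, w, h) in faces:                                # face position information
        crop = cv2.resize(gray[y:y + h, x:x + w], (48, 48))  # assumed crop size
        feats = hog(crop, orientations=9, pixels_per_cell=(8, 8),
                    cells_per_block=(2, 2))                   # HOG feature vector
        results.append((crop, feats))
    return results
```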

Embodiment 2

[0070] As shown in figure 2, this embodiment provides, on the basis of embodiment 1, a system for controlling an emotion lamp based on expressions, which includes a central processor, a data acquisition unit, a face recognition unit, an expression analysis unit, a light control unit, an infrared sensing unit, a data storage unit, and a music control unit.

[0071] The data acquisition unit collects real-time dynamic pictures of the room and voice command information, transmits the real-time dynamic pictures to the face recognition unit, and sends the voice command information to the central processor. In this embodiment, the data acquisition unit includes an image acquisition unit and a voice acquisition unit. The image acquisition unit captures the scene through a high-definition camera, infrared lighting components, or a high-fidelity microphone, and transmits the captured images to the face recognition unit; the voice acquisition unit collects the user's voice command informa...
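As a rough illustration of the data flow between these units, a minimal sketch follows. The class and method names are hypothetical (the patent describes hardware units, not software objects), and only the routing described in this embodiment is shown.

```python
class EmotionLampSystem:
    """Hypothetical wiring of the units described in embodiment 2."""

    def __init__(self, face_recognizer, expression_analyzer,
                 light_controller, music_controller, storage):
        self.face_recognizer = face_recognizer
        self.expression_analyzer = expression_analyzer
        self.light_controller = light_controller
        self.music_controller = music_controller
        self.storage = storage

    def on_frame(self, frame):
        # data acquisition unit -> face recognition unit
        for face in self.face_recognizer.detect(frame):
            # face recognition unit -> expression analysis unit
            age_class, emotion = self.expression_analyzer.analyze(face)
            # central processor drives the light and music control units
            self.light_controller.adjust(emotion)
            self.music_controller.play_for(age_class, emotion)
            self.storage.log(age_class, emotion)

    def on_voice_command(self, command):
        # voice command information is routed to the central processor
        self.light_controller.handle(command)
```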



Abstract

The invention provides a method and a system for controlling an emotion lamp based on expressions. A local anthropometry model is acquired, and facial five-sense-organ features, facial contour features, and histogram of oriented gradients (HOG) features are input into the anthropometry model to obtain the age class corresponding to the face image, so that the music adjustment scheme can be adapted to the user's age and the music played by the emotion lamp is closer to the user's real emotion. The same facial five-sense-organ features, facial contour features, and HOG features are also input into a convolutional neural network, which simplifies the expression analysis algorithm and improves the accuracy of expression analysis. The convolutional neural network has five consecutive convolutional layers and three fully-connected layers, and the number of neurons in each of the three fully-connected layers is set to 1024, so that the structure offers excellent classification performance, short execution time, and few training parameters.
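A minimal sketch of such a network follows, assuming 48x48 grayscale face inputs and seven expression classes; the convolutional channel widths, kernel sizes, pooling layout, and the final output layer for class scores are not specified in the abstract and are chosen here only for illustration.

```python
import torch
import torch.nn as nn

class ExpressionCNN(nn.Module):
    """Five consecutive conv layers, then three 1024-neuron fully-connected layers."""

    def __init__(self, num_classes=7):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128 * 6 * 6, 1024), nn.ReLU(),   # fully-connected layer 1: 1024 neurons
            nn.Linear(1024, 1024), nn.ReLU(),          # fully-connected layer 2: 1024 neurons
            nn.Linear(1024, 1024), nn.ReLU(),          # fully-connected layer 3: 1024 neurons
            nn.Linear(1024, num_classes),              # expression class scores
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# e.g. a batch of eight 48x48 grayscale face crops
logits = ExpressionCNN()(torch.randn(8, 1, 48, 48))   # shape: (8, 7)
```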

Description

Technical field

[0001] The invention relates to the field of artificial intelligence, in particular to a method and a system for controlling an emotion lamp based on expressions.

Background technique

[0002] In recent years, with the development of communication technology, smart technology, and mobile Internet technology, the smart home has become an inevitable direction for the future development of homes and housing, and the home service-oriented emotion lamp will become an important device within it. Most existing emotion lamps acquire the emotional state from facial expressions, acquire environmental parameters from environmental detection devices, and establish a lighting adjustment scheme from the emotional state and the environmental parameters; or they adjust the light based on the correspondence between behaviors and emotions. The existing lighting adjustment schemes for emotion lamps rely on a single reference object and are not suitable for all user groups. ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/62; G06F16/635; G06N3/04; G06N3/08
CPC: G06F16/636; G06N3/08; G06V40/174; G06N3/045; G06F18/241
Inventor: 海克洪, 王迎曙
Owner 武汉美和易思数字科技有限公司