Pet image emotion recognition method based on deep residual network

A pet emotion recognition technology, applied in the field of emotion recognition, addressing problems such as inaccurate recognition of pet emotions

Pending Publication Date: 2021-03-19
HANGZHOU GEXIANG TECH CO LTD

AI Technical Summary

Problems solved by technology

[0004] The invention provides a pet image emotion recognition method based on a deep residual network, which solves the problem of inaccurate recognition existing in current pet emotion recognition methods.



Examples


Embodiment Construction

[0042] To enable those skilled in the art to better understand the solutions of the embodiments of the present invention, the embodiments are described in further detail below in conjunction with the accompanying drawings.

[0043] Aiming at the current problem that pets' emotions cannot be accurately identified, so that pet owners cannot effectively soothe and interact with their pets, the invention provides a pet image emotion recognition method based on a deep residual network. The method acquires pet video and audio, classifies the audio data and marks it with emotion labels, establishes a pet image emotion dataset for the target pet, constructs a pet emotion classification algorithm model, uses the pet image emotion dataset as training data for model training, and performs target detection and emotion inference on the pet through the trained model. This solves the problem of inaccurate recognition in pet emotion recognition based on audio data alone.
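The patent does not disclose the exact architecture of its deep residual network. As an illustration of the core idea such networks are built from, the following is a minimal NumPy sketch of a residual ("skip-connection") block, y = relu(F(x) + x); the layer sizes and the use of plain matrices instead of convolutions are assumptions made for brevity.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def residual_block(x, w1, w2):
    """y = relu(F(x) + x), where F is two weighted layers.

    The identity shortcut `+ x` lets gradients flow directly past F,
    which is what makes very deep networks trainable.
    """
    f = relu(x @ w1) @ w2      # the residual function F(x)
    return relu(f + x)         # add the skip connection, then activate

rng = np.random.default_rng(0)
x = rng.standard_normal(8)              # an 8-dimensional feature vector
w1 = rng.standard_normal((8, 8)) * 0.1  # illustrative random weights
w2 = rng.standard_normal((8, 8)) * 0.1
y = residual_block(x, w1, w2)
assert y.shape == x.shape               # the shortcut preserves shape
```

Note that when F collapses to zero (e.g. all-zero weights), the block reduces to relu(x), so stacking many such blocks cannot make the model worse than a shallower one; this is the design rationale behind residual networks.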



Abstract

The invention provides a pet image emotion recognition method based on a deep residual network. The method comprises the steps of: obtaining audio and video of a single pet under different emotions, preprocessing the audio, and extracting sound features; performing statistical classification and emotion marking on the audio according to the sound features, and extracting corresponding single-frame pictures from the video according to the time sequence to obtain a pet image emotion dataset with emotion marks; constructing a pet emotion classification algorithm model, and taking the pet image emotion dataset as training data for model training; and performing target detection and emotion inference on the pet through the trained model. The method improves the certainty of pet emotion recognition as well as the inference speed and precision of recognition.
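The step "extracting corresponding single-frame pictures from the video according to the time sequence" can be sketched as mapping each emotion-marked audio segment to a video frame index by timestamp. The segment tuple format, the midpoint rule, and the emotion labels below are illustrative assumptions, not details from the patent.

```python
def segment_to_frame_index(start_s, end_s, fps):
    """Index of the video frame at the temporal midpoint of an audio segment."""
    midpoint = (start_s + end_s) / 2.0
    return int(midpoint * fps)

# Hypothetical segments labelled from the audio: (start_s, end_s, emotion_label)
segments = [(0.0, 1.0, "happy"), (2.5, 3.5, "anxious")]
fps = 30.0  # assumed video frame rate

# Pair each emotion label with the frame to extract for the image dataset.
labelled_frames = [(segment_to_frame_index(s, e, fps), label)
                   for s, e, label in segments]
# labelled_frames == [(15, "happy"), (90, "anxious")]
```

Each (frame index, label) pair can then be used to pull the actual picture from the video file, yielding the emotion-marked image dataset the method trains on.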

Description

Technical field

[0001] The invention relates to the technical field of emotion recognition, and in particular to a pet image emotion recognition method based on a deep residual network.

Background technique

[0002] At present, more and more families keep pets, and communication between people and pets is increasing. However, people often cannot effectively recognize the emotions expressed by pets, which causes communication barriers and leads to cases such as pets getting out of control or biting passers-by. Experienced pet owners can accurately judge a pet's emotions from the tone, volume, frequency, etc. of its voice and thus provide effective comfort, but not every pet owner possesses this ability.

[0003] Existing recognition methods based on audio data mainly use cepstral coefficients, formants, and zero-crossing rates as characteristic parameters to establish statistical models such as Gaussian mixture models, or use unsupervised clustering...
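The background section names the zero-crossing rate (ZCR) as one of the characteristic parameters used by existing audio-based methods. A minimal ZCR computation, shown below as a sketch rather than the patent's own procedure, is the fraction of adjacent sample pairs whose signs differ.

```python
import numpy as np

def zero_crossing_rate(signal):
    """Fraction of adjacent sample pairs in `signal` whose signs differ.

    High ZCR correlates with noisy or high-pitched sounds; low ZCR with
    low-pitched or voiced sounds, which is why it is a cheap audio feature.
    """
    signs = np.sign(np.asarray(signal, dtype=float))
    crossings = np.count_nonzero(np.diff(signs))  # sign changes between neighbours
    return crossings / (len(signs) - 1)

# An alternating signal crosses zero at every step:
assert zero_crossing_rate([1.0, -1.0, 1.0, -1.0]) == 1.0
# A signal that never changes sign has ZCR 0:
assert zero_crossing_rate([0.5, 0.7, 0.9]) == 0.0
```

In practice such features would be computed per short frame (e.g. 20-30 ms windows) and combined with cepstral and formant features before being fed to a statistical model, as the background describes.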


Application Information

IPC(8): G06K9/00, G10L17/26, G10L25/63, G06N3/04
CPC: G10L17/26, G10L25/63, G06V40/10, G06V20/46, G06N3/045
Inventor: 郭祥, 谢衍涛, 宋娜, 王鼎, 陈继, 梅启鹏
Owner HANGZHOU GEXIANG TECH CO LTD