
Accent classification method based on deep neural network and model thereof

A deep-neural-network classification technology, applied in the field of accent classification methods and models based on deep neural networks, that can solve problems such as an inaccurate learning process and the difficulty of accent detection and recognition

Pending Publication Date: 2021-06-18
ANHUI UNIVERSITY

AI Technical Summary

Problems solved by technology

Accents, however, are the pronunciation habits of a group of speakers from the same region, so accent recognition must learn a group-level feature and is therefore more challenging than speaker recognition
Detection and recognition of accents becomes even harder in speech scenarios where more and more speakers approach standard pronunciation
In addition, training for the accent recognition task is prone to overfitting, which is often caused by an inaccurate learning process



Examples


Embodiment Construction

[0056] The preferred embodiments of the present invention are described in detail below in conjunction with the accompanying drawings, so that the advantages and features of the invention can be more easily understood by those skilled in the art and the protection scope of the invention can be defined more clearly.

[0057] Referring to figure 1, the embodiment of the present invention includes:

[0058] An accent classification method based on a deep neural network comprises the following steps:

[0059] S1: Extract the frame-level frequency domain features of the original audio, and construct a 2D speech spectrum as the network input X;

[0060] Regarding the preprocessing of the input spectrogram X: for a piece of speech signal, the MFCC or FBANK frequency-domain features that are common in speech recognition tasks are extracted frame by frame to construct a 2D spectrum, and then one dimension is expanded for the CNN operation.
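As a concrete illustration of this preprocessing step, the following is a minimal Python sketch. The use of librosa, the 16 kHz sampling rate, the 25 ms window with a 10 ms shift, and the 40 MFCC coefficients are all illustrative assumptions; the patent excerpt does not fix these values.

```python
import numpy as np
import librosa

def build_spectrum_input(wav_path, n_mfcc=40, frame_ms=25, hop_ms=10):
    """Extract frame-level MFCC features from a speech signal and
    shape them as a 2D spectrum with an extra channel axis for a CNN.
    Parameter values are illustrative defaults, not the patent's."""
    y, sr = librosa.load(wav_path, sr=16000)
    n_fft = int(sr * frame_ms / 1000)   # 25 ms analysis window
    hop = int(sr * hop_ms / 1000)       # 10 ms frame shift
    # librosa returns (n_mfcc, T); transpose to (T, n_mfcc)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc,
                                n_fft=n_fft, hop_length=hop).T
    # Expand one dimension so the 2D CNN sees the input X as (T, n_mfcc, 1)
    return np.expand_dims(mfcc, axis=-1)
```

FBANK features could be substituted in the same way by calling librosa.feature.melspectrogram instead.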

[0061] S2: Construct a multi-task weight-shared CRNNs-based front-end encoder to extract the local sequence descriptors {P1, ..., PT'} of the spectrum X.
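According to the abstract, this front-end encoder is CRNNs-based and maps the spectrum X to local sequence descriptors {P1, ..., PT'}. The PyTorch sketch below shows one plausible shape for such an encoder; the layer counts, channel widths, and the bidirectional GRU are assumptions, since the excerpt does not disclose the exact architecture.

```python
import torch
import torch.nn as nn

class CRNNEncoder(nn.Module):
    """Minimal CRNN front-end sketch: convolutional layers summarize
    local time-frequency patterns, then a bidirectional GRU emits the
    local sequence descriptors {P1, ..., PT'}. Sizes are illustrative."""
    def __init__(self, n_feats=40, hidden=128):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1),
            nn.BatchNorm2d(32), nn.ReLU(),
            nn.MaxPool2d(2),                      # halve time and frequency
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.BatchNorm2d(64), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.rnn = nn.GRU(64 * (n_feats // 4), hidden,
                          batch_first=True, bidirectional=True)

    def forward(self, x):                         # x: (B, 1, T, n_feats)
        h = self.cnn(x)                           # (B, 64, T//4, n_feats//4)
        h = h.permute(0, 2, 1, 3).flatten(2)      # (B, T', 64 * n_feats//4)
        descriptors, _ = self.rnn(h)              # (B, T', 2 * hidden)
        return descriptors                        # {P1, ..., PT'}
```

Because both the accent branch and the auxiliary speech recognition branch read these descriptors, the encoder's weights are shared across the two tasks, which is presumably what "multi-task weight shared" refers to.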



Abstract

The invention discloses an accent classification method based on a deep neural network. The method comprises the steps of: extracting frame-level frequency-domain features of an original audio and constructing a 2D speech spectrum as the network input X; constructing a multi-task weight-shared CRNNs-based front-end encoder to extract local sequence descriptors {P1, ..., PT'} of the spectrum X; during training, attaching a speech recognition task branch network behind the front-end encoder to inhibit the overfitting phenomenon in accent recognition; and constructing a core branch network for the accent recognition task, which first integrates all local sequence descriptors into a global accent feature, then introduces a discriminative loss function in the prediction process, and finally classifies the global accent feature through a softmax-based classification layer to realize accent prediction. The invention also discloses a highly discriminative accent classification model based on the deep neural network, which can provide reliable accent prediction for speakers from different regional groups.
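To make the two-branch layout described above concrete, here is a hedged PyTorch sketch reusing a shared front end such as the CRNNEncoder above: the accent branch integrates the local descriptors into one global accent feature and classifies it through a softmax-based layer, while an auxiliary speech recognition branch is attached only during training to inhibit overfitting. Mean pooling, the linear heads, and the default sizes are illustrative assumptions, and since the excerpt names a "discriminative loss function" without specifying it, any concrete choice (e.g., center loss) would likewise be an assumption.

```python
import torch
import torch.nn as nn

class AccentClassifier(nn.Module):
    """Sketch of the two-branch model from the abstract: a shared
    encoder feeds (a) an accent branch that pools the descriptors
    into a global accent feature and classifies it, and (b) an
    auxiliary ASR branch used only during training. The heads and
    the pooling are assumed, not taken from the patent text."""
    def __init__(self, encoder, feat_dim=256, n_accents=8, n_tokens=100):
        super().__init__()
        self.encoder = encoder                         # shared CRNN front end
        self.accent_head = nn.Linear(feat_dim, n_accents)
        self.asr_head = nn.Linear(feat_dim, n_tokens)  # training-time branch

    def forward(self, x):
        p = self.encoder(x)             # (B, T', feat_dim) descriptors
        global_feat = p.mean(dim=1)     # integrate {P1, ..., PT'} globally
        accent_logits = self.accent_head(global_feat)  # softmax classification
        asr_logits = self.asr_head(p)   # per-frame outputs for the ASR task
        return accent_logits, asr_logits, global_feat
```

A training objective consistent with this description would combine softmax cross-entropy on accent_logits, the (unspecified) discriminative loss on global_feat, and a speech recognition loss on asr_logits; at inference time only the accent branch is evaluated.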

Description

Technical field

[0001] The invention relates to the field of deep learning, and in particular to a deep neural network-based accent classification method and its model.

Background technique

[0002] Accent is the diversity of pronunciation behavior among speakers of a certain language. The different pronunciation methods can be attributed to the speaker's social environment, living area, etc. However, a wide variety of accents can challenge the generalizability of voice-controlled technologies such as automatic speech recognition. Accent recognition technology can be used to solve accent-related problems or to predict a speaker's regional identity, and it has been extensively studied in recent years. With the development of deep learning technology, more and more image or voice recognition problems can be well solved by training artificial neural network models. Similar to face recognition, speaker recognition, etc., the core content of accent recognition tasks can be summarized as ...


Application Information

IPC(8): G10L15/00; G10L15/02; G10L15/06; G10L15/08; G10L15/16; G10L15/30; G06K9/62; G06N3/04; G06N3/08
CPC: G10L15/005; G10L15/02; G10L15/063; G10L15/08; G10L15/16; G10L15/30; G06N3/04; G06N3/08; G06F18/2415; Y02T10/40
Inventor: 王伟, 吴小培, 张超, 吕钊, 张磊, 郭晓静, 高湘萍, 周蚌艳
Owner: ANHUI UNIVERSITY