Character recognition method based on deep learning

A character recognition and deep learning technology, applied in the field of character recognition based on deep learning, which addresses the problems that the importance of feature points cannot be given a unified standard, that feature-extraction methods cannot cope with varying Chinese character shapes, and that character recognition accuracy is low.

Inactive Publication Date: 2017-10-20
HUAZHONG UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

[0007] However, traditional Chinese character recognition methods have deficiencies. Because of the complexity of Chinese characters, feature-extraction methods cannot cope with varying character shapes, and feature-point extraction methods require human experts to define the positions of important feature points. Moreover, no unified standard can be given for the importance of those feature points, which results in low character recognition accuracy.




Embodiment Construction

[0029] In order to make the object, technical solution and advantages of the present invention clearer, the present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit it. In addition, the technical features involved in the various embodiments of the present invention described below can be combined with each other as long as they do not conflict with one another.

[0030] The character recognition method based on deep learning disclosed in the present invention designs a deep spatial-transformation convolutional neural network that can actively perform various spatial transformations on input character images, thereby achieving data augmentation while improving the network's spatial invariance, and it has a relatively high ...
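As a rough illustration of the kind of layer such a network could use to warp its own inputs, the sketch below implements a minimal differentiable spatial transform module. PyTorch, the restriction to affine transforms, and all layer sizes are assumptions made for illustration; none of them are specified by the patent text.

```python
# Minimal sketch of a differentiable spatial transform layer.
# Assumptions (not from the patent): PyTorch, affine transforms only,
# arbitrary hidden sizes, character images of roughly 32x32 or larger.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SpatialTransformLayer(nn.Module):
    def __init__(self, in_channels: int):
        super().__init__()
        # 1) Localization network: maps the input feature map to 6 affine parameters.
        self.localization = nn.Sequential(
            nn.Conv2d(in_channels, 8, kernel_size=7), nn.MaxPool2d(2), nn.ReLU(True),
            nn.Conv2d(8, 10, kernel_size=5), nn.MaxPool2d(2), nn.ReLU(True),
        )
        self.fc_theta = nn.Sequential(
            nn.LazyLinear(32), nn.ReLU(True),
            nn.Linear(32, 6),  # flattened 2x3 affine matrix
        )
        # Start from the identity transform so training begins with "no warp".
        self.fc_theta[-1].weight.data.zero_()
        self.fc_theta[-1].bias.data.copy_(
            torch.tensor([1, 0, 0, 0, 1, 0], dtype=torch.float))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Localization network predicts the transform parameters theta.
        feats = self.localization(x)
        theta = self.fc_theta(feats.flatten(1)).view(-1, 2, 3)
        # 2) Grid generator: builds the sampling grid from theta.
        grid = F.affine_grid(theta, x.size(), align_corners=False)
        # 3) Sampler: bilinearly samples the input feature map at the grid points.
        return F.grid_sample(x, grid, align_corners=False)
```

Because every step (localization, grid generation, sampling) is differentiable, gradients flow through the layer and the network can learn which warps help recognition.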



Abstract

The invention discloses a character recognition method based on deep learning. The method comprises a construction phase for a spatial transform layer and a construction and training phase for a deep convolutional neural network. The spatial transform layer consists of three parts: a localization network receives a feature map as input, passes it through a series of hidden layers, and outputs the parameters of a spatial transform to be applied to the feature map; a grid generator generates sampling grids from the parameters produced in the first part; and a sampler takes the feature map and the sampling grids as inputs, samples the feature map at the grid points, and finally produces the output feature map. The spatial transform layer is differentiable. Through the spatial transform layer, image data can be spatially processed within the network, so the network learns invariance to spatial distortions and avoids the need to manually generate large numbers of deformed samples, as traditional convolutional network training requires. In addition, by building deeper convolutional neural networks, a better recognition effect is achieved on a large variety of Chinese characters.
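Assuming the hypothetical SpatialTransformLayer sketched above under paragraph [0030], the fragment below shows one way such a layer could be placed in front of a deeper convolutional classifier, with one output logit per character class. The depth, channel counts, input format, and class count are illustrative assumptions, not values taken from the patent.

```python
# Sketch only: a deeper CNN for Chinese character recognition with the
# (hypothetical) SpatialTransformLayer from the previous sketch in front,
# so spatial warping is learned inside the network instead of being
# simulated by manually deformed training samples.
import torch
import torch.nn as nn

def build_character_recognizer(num_classes: int, in_channels: int = 1) -> nn.Sequential:
    return nn.Sequential(
        SpatialTransformLayer(in_channels),           # learned spatial normalization
        nn.Conv2d(in_channels, 64, 3, padding=1), nn.ReLU(True), nn.MaxPool2d(2),
        nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(True), nn.MaxPool2d(2),
        nn.Conv2d(128, 256, 3, padding=1), nn.ReLU(True), nn.MaxPool2d(2),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(256, num_classes),                  # one logit per character class
    )

# Example forward pass on a batch of 64x64 grayscale character images
# (3755 classes, i.e. the GB2312 level-1 set, chosen only as an example).
model = build_character_recognizer(num_classes=3755)
logits = model(torch.randn(8, 1, 64, 64))
```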

Description

Technical Field

[0001] The invention belongs to the field of character recognition within pattern recognition and, more specifically, relates to a character recognition method based on deep learning.

Background Technique

[0002] With the continuous development of modern science and technology and the widespread popularization of the Internet, we come into contact every day with massive information resources presented in various forms; in daily life, study and work in particular, it is often unavoidable to handle large amounts of text information and enter it into a computer. Therefore, how to quickly and accurately input this text information into computers and other electronic devices has become an urgent problem to be solved. Optical Character Recognition (OCR for short) refers to a technology that automatically extracts the text in a picture with the help of machine equipment and converts it into text that the machine can edit.

[0003] G...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/32, G06K9/62, G06N3/02
CPC: G06N3/02, G06V20/62, G06V30/287, G06F18/214
Inventors: 凌贺飞, 赵航, 李平
Owner: HUAZHONG UNIV OF SCI & TECH