
End-to-end scene character detection and recognition method and system

A text detection and recognition method, applied in character recognition, character and pattern recognition, and instruments. It addresses the problems that separating detection from recognition prevents the two tasks from promoting each other, increases system complexity, and is not conducive to improving system speed and accuracy. The method improves recognition accuracy, shortens processing time, and improves the speed of the overall system.

Active Publication Date: 2019-10-22
SHANDONG UNIV
Cites: 5 · Cited by: 22

AI Technical Summary

Problems solved by technology

This type of algorithm separates text position detection from text content recognition, introducing a time gap and a fixed order between the two stages. Because the detection algorithm and the recognition algorithm each use a convolutional neural network to extract a feature map, considerable time is wasted on redundant computation. Moreover, treating detection and recognition as two unrelated tasks prevents them from promoting each other, increases system complexity, and is not conducive to improving the speed and accuracy of the overall system.




Embodiment Construction

[0038] The present disclosure will be further described below in conjunction with the accompanying drawings and embodiments.

[0039] It should be noted that the following detailed description is exemplary and intended to provide further explanation of the present disclosure. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs.

[0040] It should be noted that the terminology used herein is only for describing specific embodiments and is not intended to limit the exemplary embodiments according to the present disclosure. As used herein, unless the context clearly dictates otherwise, the singular forms are intended to include the plural forms. It should also be understood that when the terms "comprising" and/or "including" are used in this specification, they indicate the presence of features, steps, operations, devices, components, and/or combinations thereof.

...



Abstract

The invention provides an end-to-end scene character detection and recognition method and system. The method comprises the steps of: collecting pictures annotated with object categories, constructing training and test data sets, and preprocessing them to train a feature extraction network; collecting pictures annotated with character positions, constructing training and test data sets, and preprocessing them to train the network of the character detection part; collecting pictures annotated with both character positions and character contents, constructing training and test data sets, and performing the corresponding preprocessing to train the neural network of the character recognition part; inputting a picture into the convolutional neural network; extracting a shared convolution feature map through the feature extraction network; mapping the character position coordinates predicted by the detection part onto the shared convolution feature map, cutting out the feature map block corresponding to the character region in the image, and converting it into a feature sequence; and finally decoding the feature sequence into a readable character sequence. Since the convolution feature map is computed only once, intermediate redundant processing is avoided and the speed of the whole scene character detection and recognition system is improved.
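The pipeline described in the abstract can be sketched in a few lines. This is a minimal illustration, not the patent's implementation: the toy "backbone", the feature-map stride of 4, the 8 channels, and all function names are assumptions made for the example; the detection and recognition networks are replaced by stand-ins, and decoding uses generic CTC-style greedy collapsing.

```python
import numpy as np

def extract_shared_features(image):
    # stand-in for the CNN backbone: 4x spatial downsampling by average
    # pooling, with 8 toy "channels"; real backbones learn these features
    h, w = image.shape[:2]
    fh, fw = h // 4, w // 4
    feat = image[:fh * 4, :fw * 4].reshape(fh, 4, fw, 4).mean(axis=(1, 3))
    return np.stack([feat * (i + 1) for i in range(8)], axis=0)  # (C, fh, fw)

def crop_text_region(features, box, stride=4):
    # map a predicted image-space box onto the shared feature map, so the
    # convolution features are computed once and reused by recognition
    x1, y1, x2, y2 = [int(round(c / stride)) for c in box]
    return features[:, y1:y2, x1:x2]

def to_sequence(feat_block):
    # collapse the height axis: one feature vector per horizontal position
    return feat_block.mean(axis=1).T  # (W, C)

def greedy_ctc_decode(logits, alphabet, blank=0):
    # CTC-style greedy decoding: collapse repeats, then drop blanks
    ids = logits.argmax(axis=1)
    out, prev = [], blank
    for i in ids:
        if i != prev and i != blank:
            out.append(alphabet[i - 1])
        prev = i
    return "".join(out)
```

Feeding a 32×64 image through `extract_shared_features` yields an 8×8×16 shared map; a predicted box is then cropped from that map (not from the raw image) and flattened into a width-wise sequence for decoding, which is the "compute the feature map only once" idea.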

Description

technical field

[0001] The disclosure belongs to the technical field of text detection and recognition, and in particular relates to an end-to-end scene text detection and recognition method and system.

Background technique

[0002] The statements in this section merely provide background information related to the present disclosure and do not necessarily constitute prior art.

[0003] Text is one of humanity's greatest inventions and one of the main ways people transmit and exchange information. Pictures are one of the main carriers of text, so reading text from pictures has important practical value.

[0004] As far as the inventor understands, the traditional processing method first detects the position of the text on the original input image with a text detection algorithm and encloses the text in a text box; it then uses tools such as OpenCV to cut the region containing the text out of the picture; finally, a text recognition algorithm is ...
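The traditional two-stage pipeline described in [0004] can be sketched as follows. This is a toy illustration of the redundancy argument, not any real system: the detector, recognizer, boxes, and the call counter are all invented for the example, with a trivial function standing in for a full CNN forward pass.

```python
import numpy as np

CNN_CALLS = {"count": 0}

def cnn_forward(img):
    # stand-in for a full convolutional forward pass; the counter makes the
    # redundancy visible: the two-stage pipeline re-runs it per text region
    CNN_CALLS["count"] += 1
    return float(img.mean())

def detect_boxes(image):
    # hypothetical detector: runs its own CNN pass, returns toy fixed boxes
    cnn_forward(image)
    return [(0, 0, 8, 8), (8, 0, 16, 8)]  # (x1, y1, x2, y2)

def recognize(crop):
    # hypothetical recognizer: another CNN pass over the cropped raw pixels
    return "text" if cnn_forward(crop) >= 0 else ""

def two_stage_read(image):
    texts = []
    for x1, y1, x2, y2 in detect_boxes(image):
        crop = image[y1:y2, x1:x2]  # OpenCV-style array slicing of the image
        texts.append(recognize(crop))
    return texts
```

Running this on one image with two detected regions triggers three separate CNN passes (one for detection, one per cropped region); the end-to-end design in this disclosure removes that redundancy by reusing a single shared feature map.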

Claims


Application Information

Patent Type & Authority: Applications (China)
IPC(8): G06K9/62, G06N3/04
CPC: G06V30/10, G06N3/045, G06F18/214
Inventor: 崔鹏, 宋振, 张焕水
Owner: SHANDONG UNIV