Text detection method in natural scene based on deep learning

A natural scene and deep learning technology, applied in the field of text detection in natural scenes based on deep learning. It addresses problems such as the large variation in text aspect ratio and the increased difficulty of detection, achieving the effect of improving robustness and accuracy and ensuring a correct detection rate.

Inactive Publication Date: 2020-12-11
SHANGHAI MARITIME UNIVERSITY

AI Technical Summary

Problems solved by technology

However, many existing deep-learning-based text detection algorithms run into difficulty when detecting text in images with complex backgrounds: the text is detected as a generic target object, yet the aspect ratio (w/h) of a text region varies widely, unlike the relatively small variation in the aspect ratio of ordinary objects. In addition, text in a natural scene is affected by factors such as its size and position, which adds considerable difficulty to detection.


Examples


Embodiment Construction

[0049] Embodiments of the present invention are described below through specific examples, and those skilled in the art can readily understand other advantages and effects of the present invention from the content disclosed in this specification. The present invention can also be implemented or applied through other specific embodiments, and various modifications or changes may be made to the details in this specification, based on different viewpoints and applications, without departing from the spirit of the present invention.

[0050] Referring to figure 1, which shows a preferred embodiment of the deep-learning-based text detection method in natural scenes according to the present invention, the method comprises the following steps:

[0051] Step 1: Use the standard data sets ICDAR2013, ICDAR2015, and MSRA-TD500, with a total of M pictures, as the text image data set, where M is 5,000 to 10,000 pictures; take 70% of the total number M of the data set, a total of m1 pictures, as a trai...
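As a minimal illustration of the 70% split described in Step 1, the sketch below pools the images and separates a training subset from the remainder. The directory layout, file extension, and helper name are assumptions for illustration only and do not come from the patent.

```python
import random
from pathlib import Path

def split_text_dataset(image_dir: str, train_ratio: float = 0.7, seed: int = 42):
    """Split a pooled ICDAR2013/ICDAR2015/MSRA-TD500 image folder into a
    training list (m1 = 70% of the M pictures) and a held-out list."""
    images = sorted(Path(image_dir).glob("*.jpg"))  # assumed layout and extension
    random.Random(seed).shuffle(images)
    m1 = int(len(images) * train_ratio)
    return images[:m1], images[m1:]

# Hypothetical usage:
# train_imgs, rest_imgs = split_text_dataset("data/text_images")
```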



Abstract

The invention provides a text detection method in natural scenes based on deep learning, which belongs to the field of computer vision. The text detection method is composed of a starting module, a character region recognition network, and a capsule screening and classification network. The method comprises the following steps: first, the starting module reduces the number of network parameters and places convolution kernels of different sizes in the same convolutional layer, so that the adaptability of the network to character features of different scales is enhanced; then, single characters in the picture are detected with the character region recognition network, and the detected single characters are screened and classified by the capsule classification network to judge whether they are indeed characters; finally, the single characters are spliced into text lines through a character splicing method based on multi-feature-map fusion and dilated (hole) convolution. Compared with the prior art, the influence of factors such as the direction, size, and position of the characters on the detection result is reduced, so that the correctness of character detection is ensured and the robustness and accuracy are further improved.
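To make the abstract's description of the starting module concrete (convolution kernels of different sizes placed in the same convolutional layer), the sketch below shows one possible parallel multi-kernel block in PyTorch. The specific kernel sizes, channel counts, and the inception-style parallel-branch interpretation are assumptions for illustration, not details taken from the patent.

```python
import torch
import torch.nn as nn

class StartingModule(nn.Module):
    """Parallel branches with different kernel sizes applied to the same
    input, concatenated along the channel dimension (assumed design)."""
    def __init__(self, in_channels: int, channels_per_branch: int = 16):
        super().__init__()
        # Kernel sizes are assumed; the abstract only says "different sizes".
        self.branches = nn.ModuleList([
            nn.Conv2d(in_channels, channels_per_branch,
                      kernel_size=k, padding=k // 2)
            for k in (1, 3, 5)
        ])
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Each branch sees the same feature map at a different receptive field,
        # which is what helps the network adapt to characters of different scales.
        return self.relu(torch.cat([branch(x) for branch in self.branches], dim=1))

# Example: a 3-channel 256x256 image yields 48 output channels (3 branches x 16).
# features = StartingModule(3)(torch.randn(1, 3, 256, 256))
```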

Description

Technical Field

[0001] The invention relates to the technical field of image processing, in particular to a deep-learning-based text detection method in natural scenes.

Background Technique

[0002] With the continuous development and popularization of computer technology and smart devices, a large amount of image data from complex scenes has emerged, such as street-view advertising pictures and handwritten forms. These images are distributed in every corner of the Internet, and the information they contain is rich and valuable. Using computers to process the ever-growing variety of digital image data has therefore become an unavoidable demand, and how to extract the required, useful information from these images is becoming increasingly important; this is also a hot research area in computer vision.

[0003] In these digital images, text information is the most direct information. Understanding the text information in natural scene images has much practical significance for human-computer ...

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06K9/62G06K9/34G06N3/04
CPCG06V30/153G06V30/10G06N3/045G06F18/253G06F18/214
Inventor 刘晋王恒阳
Owner SHANGHAI MARITIME UNIVERSITY