
Optical character recognition method based on neural network

An optical character recognition and neural network technology, applied in the field of CRNN++-based text recognition and text segmentation, which addresses the problems of heavy follow-up processing, unsatisfactory OCR text segmentation and recognition accuracy, and failure to meet practical application requirements, with strong modularity and a time-saving effect.

Pending Publication Date: 2021-02-09
HANGZHOU NORMAL UNIVERSITY
Cites: 0 · Cited by: 1

AI Technical Summary

Problems solved by technology

[0005] In summary, the problems with the existing technology are that the accuracy of OCR text segmentation and recognition in the current medical examination report scenario is not ideal and considerable follow-up processing is required, so it cannot meet practical application requirements.




Detailed Description of the Embodiments

[0045] The present invention will be further described below in conjunction with drawings and embodiments.

[0046] As shown in Figures 1-6, a neural network-based optical character recognition method is implemented as follows:

[0047] Step 1. Text region segmentation stage:

[0048] The input image is preprocessed using a trainable morphological network (MorphNN) to produce an accurate mask image of the text region.

[0049] Step 2. Text recognition stage:

[0050] The specific text content in the text-region mask image is extracted using a CRNN++-based text recognition model.
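
This excerpt does not disclose the specific CRNN++ modifications, so the following is only a minimal sketch of the baseline CRNN layout that CRNN++ presumably extends: a convolutional feature extractor that collapses image height, a bidirectional LSTM over the width (time) axis, and per-timestep class scores suitable for CTC training and decoding. All layer sizes and the character-class count are illustrative assumptions, written in PyTorch.

# Minimal sketch of the recognition stage, assuming the standard CRNN layout
# (CNN feature extractor -> bidirectional LSTM -> per-timestep CTC scores).
# Layer sizes and the class count are illustrative assumptions, not the
# disclosed CRNN++ design.
import torch
import torch.nn as nn


class CRNNSketch(nn.Module):
    def __init__(self, num_classes: int = 37):  # e.g. 26 letters + 10 digits + CTC blank
        super().__init__()
        # Convolutional feature extractor: shrinks the image, then collapses its
        # height so the remaining width acts as the sequence (time) dimension.
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2, 2),    # H/2, W/2
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2, 2),  # H/4, W/4
            nn.Conv2d(128, 256, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((1, None)),                                  # collapse height to 1
        )
        # Bidirectional LSTM models the left-to-right character sequence.
        self.rnn = nn.LSTM(256, 128, num_layers=2, bidirectional=True, batch_first=True)
        self.fc = nn.Linear(256, num_classes)  # per-timestep class scores for CTC

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, height, width) grayscale text-line crops from the mask image
        feats = self.cnn(x)                        # (batch, 256, 1, width/4)
        feats = feats.squeeze(2).permute(0, 2, 1)  # (batch, width/4, 256)
        seq, _ = self.rnn(feats)
        return self.fc(seq)                        # (batch, time, num_classes)


if __name__ == "__main__":
    model = CRNNSketch()
    dummy = torch.randn(2, 1, 32, 128)             # two 32x128 text-line crops
    print(model(dummy).shape)                      # torch.Size([2, 32, 37])

In practice the per-timestep scores would be trained with a CTC loss and decoded greedily or with beam search; those steps are omitted here for brevity.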

[0051] Further, the text region segmentation steps described in step 1 are as follows:

[0052] 2-1. Convert the scanned electronic medical report into an image format, and then convert the resulting image into a grayscale image;

[0053] 2-2. Input the grayscale image into the trainable morphological network (MorphNN) and use the morphological network to simulate ...
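
The description is truncated before the MorphNN architecture is given, so the following is only a minimal sketch of one trainable morphological layer of the kind such a network could be assembled from, using the common max-plus dilation / min-minus erosion formulation. The 5x5 structuring element, the opening-style composition, and the PyTorch framing are assumptions, not the patented design.

# A minimal sketch of one trainable morphological layer. The excerpt cuts off
# before describing the actual MorphNN, so the max-plus dilation / min-minus
# erosion formulation, the 5x5 structuring element, and the grayscale
# preprocessing below are all assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TrainableMorphLayer(nn.Module):
    """Learnable grayscale dilation (or erosion) with a trainable structuring element."""

    def __init__(self, kernel_size: int = 5, mode: str = "dilate"):
        super().__init__()
        self.k = kernel_size
        self.mode = mode
        # The structuring element is a learnable weight map, updated by backprop.
        self.weight = nn.Parameter(torch.zeros(kernel_size * kernel_size))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, H, W) grayscale image in [0, 1]
        pad = self.k // 2
        patches = F.unfold(x, self.k, padding=pad)   # (batch, k*k, H*W) sliding windows
        if self.mode == "dilate":
            out = (patches + self.weight[None, :, None]).max(dim=1).values
        else:  # erode
            out = (patches - self.weight[None, :, None]).min(dim=1).values
        return out.view(x.shape[0], 1, x.shape[2], x.shape[3])


if __name__ == "__main__":
    gray = torch.rand(1, 1, 64, 64)                  # stand-in for a grayscale report scan
    out = TrainableMorphLayer(mode="erode")(gray)
    out = TrainableMorphLayer(mode="dilate")(out)    # erosion then dilation ~ opening
    print(out.shape)                                 # torch.Size([1, 1, 64, 64])

Stacking such layers and training them end to end against ground-truth text-region masks is one plausible way to realize a trainable morphological segmentation network; the actual composition used by the invention is not shown in this excerpt.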



Abstract

The invention discloses an optical character recognition method based on a neural network. The method comprises the following specific implementation steps: step 1, a text region segmentation stage, in which an input image is preprocessed with a morphological network to obtain a precise text-region mask image; and step 2, a text recognition stage, in which the specific text content in the text-region mask image is extracted using a CRNN++-based text recognition model. With the method provided by the invention, the medical text in an examination report can be quickly extracted, greatly saving the time of manually extracting text information; the method is highly modular, can quickly and effectively segment and extract medical text information using a small-sample data set, and generalizes well to many application scenarios.

Description

Technical field

[0001] The invention relates to the field of character recognition, and specifically discloses a text segmentation technology based on a morphological network (MorphNN) and a text recognition technology based on CRNN++. A neural network-based optical character recognition method is provided.

Background technique

[0002] With the continuous improvement of material living standards, people pay more and more attention to their own health. According to data from the National Bureau of Statistics, in 2018 about 70% of the working class in China's large cities was in a sub-healthy state. The problem of population aging in China is also very prominent: as of the end of 2018, China's elderly population over the age of 60 was about 250 million. People's demand for health care is increasing day by day, but at the same time China's medical resources face many problems, including shortages and uneven distribution. With the rapid ...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06K9/18G06K9/34G06K9/62G06N3/04G06N3/08G06V30/224
CPCG06N3/084G06V30/224G06V30/153G06V30/10G06N3/045G06F18/214
Inventor: 袁浩, 刘复昌
Owner: HANGZHOU NORMAL UNIVERSITY