Method and system for achieving short-distance character extraction of mobile terminal

A mobile-terminal text extraction technology, applied to neural learning methods, character recognition, and character and pattern recognition. It addresses the problem that existing text extraction methods cannot achieve both efficiency and accuracy of text information extraction, and has the effects of a good user experience, an expanded image range, and improved extraction efficiency and accuracy.

Pending Publication Date: 2021-05-28
北京三缘聚科技有限公司

AI Technical Summary

Problems solved by technology

[0006] The purpose of the present invention is to address the deficiencies of the prior art and provide a method and system for short-distance text extraction by a mobile terminal, aiming to solve the problem that existing text extraction methods cannot take into account both the efficiency and the accuracy of text information extraction.


Examples


Embodiment 1

[0114] As shown in Figure 2, a method for realizing short-distance text extraction by a mobile terminal includes:

[0115] Step S100, specifically:

[0116] Turn on the camera, enable the mobile terminal's auto-focus function, and check whether the image currently captured by the camera is in focus. If focusing succeeds, control the camera to automatically capture a clear image; if focus is not achieved, refocus until a clear image is captured. Display the acquired image on the screen;

[0117] While the image is being acquired, use the inertial measurement unit to acquire the attitude data of the current mobile terminal;
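Step S100 can be sketched as a capture loop that keeps grabbing frames until one passes a sharpness test and records the device attitude at that instant. This is a minimal illustration only: the patent does not specify a focus metric or a sensor API, so the Laplacian-variance threshold and the read_rotation_vector() helper below are assumptions; on a real handset the camera driver's autofocus state and the platform sensor service would be used instead.

import cv2
import numpy as np

FOCUS_THRESHOLD = 100.0  # assumed sharpness threshold (variance of the Laplacian)

def is_in_focus(gray: np.ndarray) -> bool:
    # Heuristic focus check: the variance of the Laplacian rises with image sharpness.
    return cv2.Laplacian(gray, cv2.CV_64F).var() > FOCUS_THRESHOLD

def read_rotation_vector() -> np.ndarray:
    # Hypothetical IMU helper; on a real handset this attitude data would come
    # from the platform's rotation-vector sensor. Placeholder only.
    raise NotImplementedError

def capture_clear_image(camera_index: int = 0):
    # Step S100 sketch: keep grabbing frames until one is in focus, then return
    # that frame together with the attitude recorded at the moment of capture.
    cap = cv2.VideoCapture(camera_index)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                continue
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            if is_in_focus(gray):
                attitude_at_capture = read_rotation_vector()
                return frame, attitude_at_capture
            # Not in focus yet: a real device would trigger the camera to refocus here.
    finally:
        cap.release()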

[0118] Step S200, specifically:

[0119] Use the inertial measurement unit to obtain, in real time, the rotation vector r of the current mobile terminal and the acceleration a and/or angular velocity w of the current mobile device; according to the acquired rotation vector r of the current mobile device and the acceleration a and/or an...
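The geometric transformation of Step S200 can be illustrated under a pure-rotation assumption: if the terminal has mainly rotated since the clear image was captured, the displayed image can be re-projected with the homography H = K·R_rel·K⁻¹, where K is the camera intrinsic matrix and R_rel the relative rotation derived from the IMU rotation vectors. The quaternion parameterization, the intrinsics, and the neglect of translation (which would require a depth estimate) are assumptions of this sketch, not details stated in the patent.

import cv2
import numpy as np
from scipy.spatial.transform import Rotation

def rotation_homography(K: np.ndarray, q_capture: np.ndarray, q_now: np.ndarray) -> np.ndarray:
    # Homography mapping the pre-captured image into the current view, assuming the
    # motion since capture is approximately a pure rotation. q_capture and q_now are
    # unit quaternions (x, y, z, w) describing the device attitude reported by the IMU.
    R_capture = Rotation.from_quat(q_capture).as_matrix()
    R_now = Rotation.from_quat(q_now).as_matrix()
    R_rel = R_now.T @ R_capture  # capture-frame directions expressed in the current frame
    return K @ R_rel @ np.linalg.inv(K)

def simulate_live_view(image: np.ndarray, K: np.ndarray, q_capture: np.ndarray, q_now: np.ndarray) -> np.ndarray:
    # Warp the pre-captured image so that it follows the terminal's motion,
    # i.e. the "simulated real-time picture" shown to the user on screen.
    H = rotation_homography(K, q_capture, q_now)
    h, w = image.shape[:2]
    return cv2.warpPerspective(image, H, (w, h))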

Embodiment 2

[0125] As shown in Figure 3, a method for realizing short-distance text extraction by a mobile terminal includes:

[0126] Step S100, specifically:

[0127] Turn on the camera, enable the mobile terminal's auto-focus function, and check whether the image currently captured by the camera is in focus. If focusing succeeds, control the camera to automatically capture a clear image; if focus is not achieved, refocus until a clear image is captured;

[0128] Capture a first, clearly focused image, use the inertial measurement unit to obtain the attitude data of the current mobile terminal while the first image is taken, and display the first image on the screen;

[0129] When the inertial measurement unit senses that the mobile terminal is moving, the camera is controlled to collect a second image adjacent to the first image within the image collection period, wherein the first image and the second image have an overlapping portion;
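Because the first and second images share an overlapping portion, they can be registered and merged into a larger canvas, which is one plausible way to realize the expanded image range mentioned in the summary. The ORB feature matching and RANSAC homography estimation below are illustrative assumptions; the excerpt does not commit to a particular registration method.

import cv2
import numpy as np

def register_overlapping(first: np.ndarray, second: np.ndarray) -> np.ndarray:
    # Estimate the homography that aligns the second image onto the first,
    # using feature correspondences found in the overlapping region.
    orb = cv2.ORB_create(nfeatures=2000)
    k1, d1 = orb.detectAndCompute(cv2.cvtColor(first, cv2.COLOR_BGR2GRAY), None)
    k2, d2 = orb.detectAndCompute(cv2.cvtColor(second, cv2.COLOR_BGR2GRAY), None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d2, d1), key=lambda m: m.distance)[:200]

    src = np.float32([k2[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k1[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H  # maps second-image coordinates into the first image's frame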

[0130] Step S200, specif...



Abstract

The invention provides a method and system for achieving short-distance text extraction on a mobile terminal. The method comprises the steps of: turning on a camera, obtaining an image containing the text content the user wants to recognize, and displaying the obtained image on a screen; acquiring motion information of the mobile terminal in real time with an inertial measurement unit, performing a geometric transformation on the image displayed on the screen according to that motion information, and displaying the geometrically transformed image on the screen in real time; and extracting the character information in the displayed image that falls within the to-be-recognized area on the screen. By matching the pre-captured image with the current motion of the mobile terminal, the invention simulates the camera's real-time picture, so the image shown to the user is not blurred by defocus caused by the camera being too close to the photographed object. The user only needs to control the mobile terminal with one hand to specify the text to be extracted, continuous text extraction can be achieved, and the efficiency and accuracy of text extraction are effectively improved.
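The last step of the abstract, extracting only the characters that fall inside the to-be-recognized area on the screen, can be sketched by cropping that area from the geometrically transformed image and passing the crop to a character recognizer. The abstract points to neural-network character recognition but names no engine, so the use of pytesseract and the (x, y, w, h) region format here are assumptions for illustration.

import cv2
import numpy as np
import pytesseract  # assumed OCR backend; the patent does not name one

def extract_region_text(displayed: np.ndarray, region: tuple) -> str:
    # Crop the to-be-recognized area (x, y, w, h) from the image currently shown
    # on screen and run character recognition on that crop only.
    x, y, w, h = region
    crop = displayed[y:y + h, x:x + w]
    gray = cv2.cvtColor(crop, cv2.COLOR_BGR2GRAY)
    # The language packs (e.g. 'chi_sim+eng') would be chosen per deployment.
    return pytesseract.image_to_string(gray, lang="chi_sim+eng")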

Description

technical field
[0001] The invention belongs to the technical field of image processing and text extraction, and relates to a method and system for short-distance text extraction by a mobile terminal.
Background technique
[0002] With the widespread popularization of smart mobile devices, text recognition technology is increasingly applied to smart mobile devices with image scanning or camera functions, so that smart mobile terminals can also realize text recognition.
[0003] In previous text extraction methods, the page containing the text to be recognized is displayed on the screen through the camera, and the user must move a cursor on the screen to locate the character area to be recognized before text is extracted from that area. However, this requires the user to hold the smart mobile terminal with one hand and move the cursor with the other hand for positioning, which makes the operation cumbersome and the t...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K9/32; G06K9/34; G06K9/46; G06N3/04; G06N3/08
CPC: G06N3/04; G06N3/08; G06V20/62; G06V30/153; G06V10/44; G06V10/462; G06V30/10
Inventor: 宗毅, 段志超, 任柏成
Owner: 北京三缘聚科技有限公司