Image processing method, image processing apparatus, and image forming apparatus

An image processing method and image processing apparatus, applied in the field of image processing apparatuses and image forming apparatuses, that address the problems of a small number of feature points, the inability to secure enough feature points for the calculation, and the resulting decrease in feature-quantity accuracy, and achieve the effect of stable determination accuracy.

Publication Date: 2008-12-17 (Inactive)
SHARP KK


Problems solved by technology

[0006] However, with this method, the number of centers of gravity calculated from the input document is small, so the number of feature points may also be small.
In addition, if the feature quantity is calculated from only a few feature points, the feature points required for the calculation cannot be sufficiently secured, so there is a problem that the accuracy of the calculated feature quantity itself decreases.



Examples


Embodiment 1

[0111] FIG. 12 is a schematic diagram showing the structure of the center-of-gravity calculation unit 2214 of Embodiment 1. In Embodiment 1, the connected-component threshold processing unit 2214b functions as a first connected-component threshold processing unit 2214b1 and a second connected-component threshold processing unit 2214b2, the center-of-gravity calculation processing unit 2214c functions as a first center-of-gravity calculation processing unit 2214c1 and a second center-of-gravity calculation processing unit 2214c2, and the center-of-gravity storage buffer 2214d serves as a first center-of-gravity storage buffer 2214d1 and a second center-of-gravity storage buffer 2214d2. Unlike in the other embodiments, the center-of-gravity calculation unit 2214 performs threshold determination based on a first connected-component threshold and a second connected-component threshold that differ from each other (for example, 30 and 100), performs cen...
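To make the split into first and second threshold processing concrete, here is a minimal Python sketch, not the patent's implementation: connected components are labeled, their pixel counts are compared against two different lower limits, and the surviving centroids are stored in two separate buffers, mirroring 2214d1 and 2214d2. The use of scipy.ndimage and the helper name centroids_by_threshold are assumptions; only the example thresholds 30 and 100 come from the paragraph above.

```python
import numpy as np
from scipy import ndimage

def centroids_by_threshold(binary_image, threshold):
    """Hypothetical helper: centroids of connected components whose
    pixel count is at least `threshold`."""
    labels, n = ndimage.label(binary_image)
    sizes = ndimage.sum(binary_image, labels, index=range(1, n + 1))
    kept = [i + 1 for i, size in enumerate(sizes) if size >= threshold]
    return ndimage.center_of_mass(binary_image, labels, kept)

# Toy binarized "document": filled rectangles standing in for connected components.
img = np.zeros((120, 120), dtype=np.uint8)
img[10:14, 10:14] = 1    # 16 pixels  -> rejected by both thresholds
img[30:40, 30:40] = 1    # 100 pixels -> accepted by both
img[60:68, 60:65] = 1    # 40 pixels  -> accepted only by the first threshold

first_buffer = centroids_by_threshold(img, threshold=30)    # first connected-component threshold
second_buffer = centroids_by_threshold(img, threshold=100)  # second connected-component threshold
print(len(first_buffer), len(second_buffer))                # 2 1
```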

Embodiment 2

[0114] FIG. 13 is an explanatory diagram showing an example of an input document image. Input documents come in a variety of formats; it is rare for the same characters to appear throughout the entire document, and in many cases no characters exist in the top, bottom, left, and right end regions of the document. For example, as shown in FIG. 13, when the lower limit of the threshold determination is uniformly set to 100, the noise falls within the threshold range and therefore becomes a target of center-of-gravity calculation, while a connected component of a character in the central region of the document image whose pixel count is 100 or less (for example, the pixel count of point i) is excluded from center-of-gravity calculation. As a result, the accuracy of the calculated centers of gravity may decrease.

[0115] Therefore, in the second embodiment, by dividing t...
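Paragraph [0115] is cut off above, but read together with [0114] it points toward dividing the document image into regions and applying a different lower limit in each, so that edge noise is rejected without discarding small characters in the center. The following Python sketch is only an assumed illustration of that idea, not the claimed method; the margin ratio, the thresholds, and the helper name are invented here.

```python
import numpy as np
from scipy import ndimage

def region_dependent_centroids(binary_image, center_threshold=30,
                               edge_threshold=300, margin=0.1):
    """Hypothetical sketch: lenient pixel-count threshold in the central
    region, strict threshold in the top/bottom/left/right edge regions."""
    h, w = binary_image.shape
    top, left = int(h * margin), int(w * margin)
    bottom, right = h - top, w - left

    labels, n = ndimage.label(binary_image)
    centroids = []
    for idx in range(1, n + 1):
        component = labels == idx
        size = int(np.sum(component))
        cy, cx = ndimage.center_of_mass(component)
        in_center = top <= cy < bottom and left <= cx < right
        limit = center_threshold if in_center else edge_threshold
        if size >= limit:
            centroids.append((cy, cx))
    return centroids
```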

Embodiment 3


[0135] FIG. 23 is a flowchart showing the procedure of the processing of the feature amount calculation unit 222 of Embodiment 3. In the feature amount calculation unit 222 of Embodiment 3, the pixel block setting unit 2220 sets a pixel block composed of one or more pixels of the image and sets the mask length of the set pixel block (S501). The feature amount calculation unit 222 takes one of the extracted feature points as the feature point of interest, sets the eight pixel blocks surrounding the pixel block that contains the feature point of interest as the peripheral area, and outputs data including the set pixel block, the mask length, and the peripheral area to the nearby point extraction unit 2221.
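As one way to picture step S501 and the peripheral-area setting in [0135], the sketch below lays a pixel-block grid over the image and, for a feature point of interest, collects the eight blocks surrounding the block that contains it. The 64-pixel block size and the function names are assumptions for illustration; the mask length is not modeled here.

```python
def block_index(point, block_size=64):
    """Return the (row, col) index of the pixel block containing a (y, x) point."""
    y, x = point
    return y // block_size, x // block_size

def peripheral_area(feature_point_of_interest, block_size=64):
    """Hypothetical sketch: the 8 pixel blocks around the block holding the
    feature point of interest (a 3x3 neighborhood minus its center)."""
    br, bc = block_index(feature_point_of_interest, block_size)
    return [(br + dr, bc + dc)
            for dr in (-1, 0, 1) for dc in (-1, 0, 1)
            if (dr, dc) != (0, 0)]

# Example: a feature point at (y=130, x=200) with 64-pixel blocks.
print(peripheral_area((130, 200)))   # the 8 blocks around block (2, 3)
```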

[0136] In the feature amount calculation unit 222, the nearby point extraction unit 2221 receives the data output from the pixel block setting unit 2220 and extracts nearby feature points within...
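Paragraph [0136] is truncated, but it evidently continues the flow of FIG. 23: the nearby point extraction unit 2221 keeps the feature points that fall inside the peripheral area received from the previous step. A hedged sketch of that selection, where the block size and the list of peripheral block indices (such as the output of the previous sketch) are assumed inputs:

```python
def nearby_feature_points(feature_points, peripheral_blocks, block_size=64):
    """Hypothetical sketch: keep feature points whose containing pixel block
    lies in the peripheral area around the feature point of interest."""
    blocks = set(peripheral_blocks)
    return [(y, x) for (y, x) in feature_points
            if (y // block_size, x // block_size) in blocks]

# Example with made-up feature points and peripheral block indices.
points = [(130, 200), (70, 180), (500, 40)]
blocks = [(1, 2), (1, 3), (2, 3)]
print(nearby_feature_points(points, blocks))   # [(130, 200), (70, 180)]
```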



Abstract

The number of pixels in an identified pixel region is counted. When the counted number of pixels is determined to be equal to or higher than a first threshold value, a feature point of the pixel region is extracted and the number of feature points is counted. It is then determined whether the counted number of feature points is equal to or lower than a second threshold value: when the number of feature points is above the second threshold value, a feature quantity is calculated based on the feature points extracted from the pixel regions; when the number of feature points is equal to or lower than the second threshold value, the first threshold value is changed. The image similarity determination process can thereby be performed stably, without degradation in determination accuracy.
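Read as a control flow, the abstract describes extracting feature points only from pixel regions whose pixel count reaches a first threshold, and relaxing that threshold whenever the resulting number of feature points does not exceed a second threshold. Below is a minimal Python sketch of that loop, under the assumption that each pixel region is summarized as a (pixel count, feature point) pair and that "changing the first threshold value" means lowering it; all concrete values are invented for illustration.

```python
def extract_feature_points(regions, first_threshold, second_threshold,
                           lower_step=10, min_threshold=1):
    """Hypothetical sketch of the thresholding loop described in the abstract.

    `regions` is a list of (pixel_count, feature_point) pairs. A region
    contributes its feature point only if its pixel count is at least the
    current first threshold; if too few feature points survive (<= the
    second threshold), the first threshold is lowered and extraction repeats.
    """
    threshold = first_threshold
    while True:
        points = [fp for count, fp in regions if count >= threshold]
        if len(points) > second_threshold or threshold <= min_threshold:
            return points, threshold          # enough points for a stable feature quantity
        threshold -= lower_step               # change the first threshold and retry

# Example: with a fixed threshold of 100 only one region qualifies, so the
# threshold is relaxed until more than second_threshold feature points remain.
regions = [(120, (10, 12)), (80, (40, 44)), (35, (60, 61)), (20, (90, 15))]
points, used = extract_feature_points(regions, first_threshold=100, second_threshold=3)
print(used, points)   # 20 [(10, 12), (40, 44), (60, 61), (90, 15)]
```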

Description

Technical Field

[0001] The present invention relates to an image processing method, an image processing apparatus, and an image forming apparatus for determining the similarity of images, and in particular to an image processing method, an image processing apparatus, and an image forming apparatus capable of improving the accuracy of similarity determination for images with few feature points.

Background Art

[0002] Conventionally, input image data obtained by scanning a document image with a scanner is compared with a registered image registered in advance to determine the degree of similarity between the two, and the processing of the input image data (for example, copying, sending, or editing) is controlled based on the determination result.

[0003] As methods of judging the degree of similarity, there are, for example, a method of extracting a keyword from an image by OCR (Optical Character Reader) and performing matching using the keyword, or lim...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06K 9/62; G06K 9/52
Inventor: 芳野大树
Owner: SHARP KK