Image quality determination method and device, electronic equipment and medium

An image quality technology applied in the field of image processing, which can solve problems such as poor-quality images affecting the conversion rate of advertisements.

Pending Publication Date: 2021-06-18
BEIJING BAIDU NETCOM SCI & TECH CO LTD

AI-Extracted Technical Summary

Problems solved by technology

It is difficult for users to click on images with poor quality, which affects the conversion rate of advertisements.

Method used

By determining the texture energy threshold of the image to be evaluated according to the texture energy value of each pixel in the image, the number of pixels in the vertical direction of the image and the number of pixels in the horizontal direction of the image, the texture energy threshold is obtained adaptively from the information of the image to be evaluated itself; no manual setting by technical personnel is required, which improves efficiency.

By determining the area ratio between the area of the foreground region and the area of the image to be evaluated, and the quantity ratio between the number of background pixels belonging to each color category and the total number of pixels in the image, and then determining the image quality of the image according to the area ratio and the quantity ratios, the image quality is determined jointly from the area ratio associated with the foreground region and the quantity ratios associated with the background pixels, which improves the reliability and credibility of the final image quality determination result.

The present disclosure determines the texture energy threshold of the image to be evaluated according to the texture energy value of each pixel in the image, determines the foreground region of the image according to the texture energy values and the texture energy threshold, and then determines the background region according to the foreground region, so that the background region can be determined quickly based on the per-pixel texture energy values and the adaptive texture energy threshold, which reduces the workload required to determine the background region and improves the efficiency of determining the background region; by matching the color space value of any background pix...

Abstract

The invention discloses an image quality determination method and device, electronic equipment and a medium, and relates to the technical field of image processing, in particular to the technical field of image quality evaluation and computer vision. According to the specific implementation scheme, the method comprises the steps of determining a background region of a to-be-evaluated image according to a texture energy value of each pixel point in the to-be-evaluated image; determining the color category of each background pixel point in the background area; and determining the image quality of the to-be-evaluated image according to the number of the background pixel points belonging to each color category. The effect of reducing the workload required for determining the background area of the to-be-evaluated image is achieved, the efficiency of determining the background area is improved, and then the efficiency of determining the image quality is improved.

Application Domain

Image enhancement, Image analysis (+1)

Technology Topic

Engineering, Computer graphics (images) (+7)

Examples

  • Experimental program(1)

Example Embodiment

[0026] Exemplary embodiments of the present disclosure are described below, including various details of the embodiments to facilitate understanding, and they should be regarded as merely exemplary. Accordingly, those skilled in the art will appreciate that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the present disclosure. Likewise, descriptions of well-known functions and structures are omitted from the following description for clarity and conciseness.
[0027] The applicants found during development that, in the prior art, detecting whether the background color of an image is monotonous first obtains the background region of the image to be detected by target detection or target segmentation techniques, and then determines whether the color is monotonous based on the acquired background region.
[0028] However, target detection techniques need to predefine the categories of objects to be detected, and a wide variety of object categories makes the data difficult to collect and annotate and the tasks difficult to train, so the amount of work required to evaluate an image is large. Target segmentation techniques likewise need to predefine the categories of objects to be segmented, which leads to the same difficulties and also makes the workload required to determine the background of the image to be evaluated large.
[0029] It can be seen that both of the above methods of determining the background region have low detection efficiency.
[0030] Figure 1 is a flow chart of a method of determining image quality disclosed in the present disclosure. This embodiment can be applied to the case of determining the image quality of an image to be evaluated. The method can be performed by an image quality determining device, which can be implemented in software and/or hardware and integrated on any electronic device with computing capability.
[0031] As shown in Figure 1, the method of determining image quality disclosed in this embodiment can include:
[0032] S101, determine the background region of the image to be evaluated based on the texture energy value of each pixel in the image to be evaluated.
[0033] Here, the image to be evaluated can be an independent image or any video frame image in a video. The texture energy value is a texture feature that reflects the uniformity of the grayscale distribution and the coarseness of the texture of an image; it is usually used to detect the texture characteristics of the foreground region of an image, and since the background region is mostly a solid color, the texture characteristics of the background region are largely absent.
[0034] In one embodiment, the horizontal gradient and vertical gradient of each pixel in the image to be evaluated are first determined according to an existing gradient calculation method. This embodiment uses a gradient-template-based method and determines the horizontal gradient and vertical gradient of each pixel in the image to be evaluated by the following formulas:
[0035] Gx = X * I
[0036] Gy = Y * I
[0037] where Gx denotes the horizontal gradient of each pixel in the image to be evaluated, Gy denotes the vertical gradient of each pixel in the image to be evaluated, I denotes the grayscale value of each pixel in the image to be evaluated, X denotes the horizontal gradient template, and Y denotes the vertical gradient template. Optionally, the horizontal gradient template X and the vertical gradient template Y can be selected as:
[0038]
[0039] Next, after obtaining the horizontal gradient and vertical gradient of each pixel in the image to be evaluated, the texture energy value of each pixel is calculated from its horizontal gradient and vertical gradient. The specific calculation can be expressed by the following formula:
[0040] E(i, j) = |Gx(i, j)| + |Gy(i, j)|
[0041] where E(i, j) denotes the texture energy value of the pixel in row i and column j of the image to be evaluated, Gx(i, j) denotes the horizontal gradient of that pixel, and Gy(i, j) denotes the vertical gradient of that pixel.
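To make the gradient-template step concrete, the following Python sketch computes per-pixel horizontal and vertical gradients and the texture energy value E(i, j) = |Gx(i, j)| + |Gy(i, j)|. Since the specific 3x3 templates X and Y are not reproduced above, the sketch assumes Prewitt-style templates as a stand-in; any gradient template could be substituted.

```python
import numpy as np
from scipy.ndimage import convolve

def texture_energy(gray: np.ndarray) -> np.ndarray:
    """Compute the texture energy value E(i, j) = |Gx(i, j)| + |Gy(i, j)|.

    `gray` is a 2-D array of grayscale values. The 3x3 gradient templates
    below are Prewitt-style placeholders; the disclosure leaves the exact
    templates X and Y open.
    """
    X = np.array([[-1, 0, 1],
                  [-1, 0, 1],
                  [-1, 0, 1]], dtype=np.float64)   # horizontal gradient template (assumed)
    Y = X.T                                        # vertical gradient template (assumed)
    gx = convolve(gray.astype(np.float64), X, mode="nearest")  # Gx = X * I
    gy = convolve(gray.astype(np.float64), Y, mode="nearest")  # Gy = Y * I
    return np.abs(gx) + np.abs(gy)
```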
[0042] Finally, since there is a large difference in the texture energy distribution between the background pixels in the background region and the foreground pixels in the foreground region, the region composed of pixels whose texture energy values are smaller than a preset threshold is determined, by means of threshold-based decision on the texture energy value of each pixel in the image to be evaluated, as the background region of the image to be evaluated.
[0043] It is worth explaining that this embodiment only takes the gradient-template method as an example to explain how the horizontal gradient and vertical gradient of each pixel are determined; the specific method is not limited, and any method that can determine the horizontal gradient and vertical gradient of each pixel should fall within the protection scope of the present invention.
[0044] By determining the background region of the image to be evaluated according to the texture energy value of each pixel in the image to be evaluated, the background region can be obtained directly from the per-pixel texture energy values, achieving the effect of quickly determining the background region of the image to be evaluated.
[0045] S102, determine the color category of each background pixel in the background region.
[0046] Here, a background pixel is a pixel in the background region. The color category of a background pixel represents the color of that background pixel, for example a white background pixel, a black background pixel or a red background pixel.
[0047] In one embodiment, the background pixels in the background region are traversed and the color space value of each background pixel is acquired, where the color space value of a background pixel represents the value of that pixel in a preset color space; the preset color space includes, but is not limited to, the RGB color space, the CMY color space, the HSV color space and the HSI color space. The color space value of any background pixel is matched against each candidate color space value, and the color category corresponding to the successfully matched candidate color space value is used as the color category of that background pixel. For example, if the RGB color space value of a background pixel is (255, 255, 255) and the color category corresponding to the candidate color space value (255, 255, 255) is "white", the color category of that background pixel is determined to be "white".
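A minimal sketch of this matching step is shown below, assuming an exact-match table of candidate RGB color space values; the candidate values and category names other than the white example from the text are illustrative only.

```python
# Hypothetical candidate table: exact RGB color space values mapped to color categories.
CANDIDATE_COLORS = {
    (255, 255, 255): "white",   # example from the text
    (0, 0, 0): "black",         # illustrative entry
    (255, 0, 0): "red",         # illustrative entry
}

def color_category(rgb):
    """Return the color category whose candidate color space value matches `rgb`, or None."""
    return CANDIDATE_COLORS.get(tuple(rgb))

# Example from the text: (255, 255, 255) matches the candidate value for "white".
print(color_category((255, 255, 255)))  # white
```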
[0048] Determining the color category of each background pixel in the background region lays the foundation for subsequently determining the image quality of the image to be evaluated.
[0049] S103, determine the image quality of the image to be evaluated based on the number of background pixels belonging to each color category.
[0050] Here, the image quality of the image to be evaluated includes quality abnormal and quality normal.
[0051] In one embodiment, the number of background pixels belonging to each color category in the image to be evaluated is determined, and the number of background pixels of each color category is compared with a quantity threshold. If the number of background pixels of any color category is greater than the quantity threshold, the image quality of the image to be evaluated is determined to be abnormal, that is, the image to be evaluated has the problem of a monotonous background color. For example, if the quantity threshold is 9500 and the number of white background pixels exceeds it, the image quality of the image to be evaluated is determined to be abnormal.
[0052] In another embodiment, the total number of pixels contained in the image to be evaluated and the number of background pixels belonging to each color category are determined, and the number of background pixels of each color category is divided by the total number of pixels contained in the image to be evaluated, giving the quantity ratio between the number of background pixels of that color category and the total number of pixels. Each quantity ratio is compared with a quantity ratio threshold. If the quantity ratio corresponding to any color category is greater than the quantity ratio threshold, the image quality of the image to be evaluated is determined to be abnormal, that is, the image has the problem of a monotonous background color. For example, if the quantity ratio corresponding to white is 66.4% and the quantity ratio threshold is 65%, the image quality of the image to be evaluated is determined to be abnormal.
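The quantity-ratio check described above can be sketched as follows; the 65% threshold is the example value from the text and the category counts are illustrative.

```python
def background_color_monotonous(category_counts, total_pixels, ratio_threshold=0.65):
    """Return True (quality abnormal) if the background pixels of any color
    category exceed `ratio_threshold` of all pixels in the image to be evaluated."""
    return any(count / total_pixels > ratio_threshold
               for count in category_counts.values())

# Example from the text: white background pixels make up 66.4% of the image.
print(background_color_monotonous({"white": 6640, "gray": 300}, total_pixels=10000))  # True
```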
[0053] In this way, the image quality of the image to be evaluated is determined directly from the number of background pixels belonging to each color category.
[0054] The present disclosure determines the background region of the image to be evaluated according to the texture energy value of each pixel in the image, determines the color category of each background pixel in the background region, and then determines the image quality of the image to be evaluated according to the number of background pixels belonging to each color category. The present disclosure does not need to define the categories to be detected or segmented, and does not need to collect, annotate or train on a data set, which greatly reduces the workload required to determine the background region of the image to be evaluated, improves the efficiency of determining the background region, and thereby indirectly improves the efficiency of determining image quality and reduces the time required to determine image quality.
[0055] On the basis of the above embodiment, S103 further includes:
[0056] if the image to be evaluated is any video frame image in a video, determining the video quality of the video according to the image quality of all video frame images in the video.
[0057] In one embodiment, the number of video frame images in the video whose image quality is abnormal is determined, and the ratio between the number of quality-abnormal video frame images and the total number of video frame images in the video is calculated; if the ratio is greater than a preset ratio, the video quality of the video is determined to be abnormal.
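A sketch of the video-level decision, assuming per-frame quality results are already available; the preset ratio of 0.5 is an assumed placeholder, since the text does not give a specific value.

```python
def video_quality_abnormal(frame_is_abnormal, preset_ratio=0.5):
    """Return True if the fraction of quality-abnormal video frame images
    exceeds the preset ratio (placeholder value 0.5)."""
    abnormal = sum(1 for flag in frame_is_abnormal if flag)
    return abnormal / len(frame_is_abnormal) > preset_ratio

# Illustrative usage: 3 of 4 frames are quality-abnormal.
print(video_quality_abnormal([True, True, True, False]))  # True
```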
[0058] By determining the video quality of a video according to the image quality of all video frame images in the video, the determination is extended from image quality to video quality, which enlarges the application range of the method.
[0059] The applicants also found during development that: 1) it is difficult to describe a color category as perceived by the human eye with a single theoretical color space value, because a color category perceived by the human eye corresponds not to a single color space value but to a range of color space values; 2) determining the image quality of the image to be evaluated simply from the number of background pixels belonging to each color category is not very accurate; for example, some images have a monotonous background color, but their layout is reasonable and the visual effect is still striking.
[0060] Therefore, the present disclosure improves on the above two problems.
[0061] Figure 2A is a flow chart of a method of determining image quality disclosed in the present disclosure, which is further optimized and expanded on the basis of the above technical solution and can be combined with each of the alternative embodiments described above.
[0062] S201, determine the texture energy threshold of the image to be evaluated according to the texture energy value of each pixel in the image to be evaluated.
[0063] Here, the texture energy threshold in this embodiment is an adaptive threshold, which can be derived automatically from the texture energy values of the pixels in any image to be evaluated; there is no need, as in the related art, to set a texture energy threshold for each image to be evaluated according to experience.
[0064] Optionally, S201 includes:
[0065] determining the texture energy threshold of the image to be evaluated according to the texture energy value of each pixel in the image to be evaluated, the number of pixels in the vertical direction of the image to be evaluated and the number of pixels in the horizontal direction of the image to be evaluated.
[0066] Here, the number of pixels in the vertical direction of the image to be evaluated is the number of pixels of the image along its vertical direction, and the number of pixels in the horizontal direction is the number of pixels of the image along its horizontal direction.
[0067] In one embodiment, the texture energy values of all pixels in the image to be evaluated are added to obtain a total texture energy value, the number of pixels in the vertical direction is multiplied by the number of pixels in the horizontal direction, and the texture energy threshold of the image to be evaluated is determined according to the ratio between the total texture energy value and the multiplication result.
[0068] By determining the texture energy threshold of the image to be evaluated according to the texture energy value of each pixel in the image, the number of pixels in the vertical direction and the number of pixels in the horizontal direction, the texture energy threshold is obtained adaptively from the information of the image itself; no manual setting by technical personnel is required, which improves efficiency.
[0069] Optionally, "determining the texture energy threshold of the image to be evaluated according to the texture energy value of each pixel in the image to be evaluated, the number of pixels in the vertical direction of the image to be evaluated and the number of pixels in the horizontal direction of the image to be evaluated" includes:
[0070] determining the texture energy threshold of the image to be evaluated by the following formula:
[0071] Thres = (s / (h × w)) × Σ(i=1..h) Σ(j=1..w) E(i, j)
[0072] where Thres denotes the texture energy threshold of the image to be evaluated; s denotes a scaling coefficient, optionally set to 10; h denotes the number of pixels in the vertical direction of the image to be evaluated; w denotes the number of pixels in the horizontal direction of the image to be evaluated; and E(i, j) denotes the texture energy value of the pixel in row i and column j of the image to be evaluated.
[0073] Calculating the texture energy threshold of the image to be evaluated by the above formula provides a specific implementation for determining the texture energy threshold of the image to be evaluated.
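A sketch of the adaptive threshold under one reading of the formula above, in which the scaling coefficient s scales the mean texture energy; if the disclosure instead intends s to divide the ratio, only the return expression changes.

```python
import numpy as np

def texture_energy_threshold(energy: np.ndarray, s: float = 10.0) -> float:
    """Adaptive texture energy threshold Thres = (s / (h * w)) * sum of E(i, j).

    `energy` is the h x w array of per-pixel texture energy values; s is the
    scaling coefficient (optionally 10 per the text). The placement of s is
    an assumed reading of the formula.
    """
    h, w = energy.shape
    return s * float(energy.sum()) / (h * w)
```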
[0074] S202, determine the foreground region of the image to be evaluated according to the texture energy value of each pixel in the image to be evaluated and the texture energy threshold, and determine the background region of the image to be evaluated according to the foreground region.
[0075] In one embodiment, the texture energy value of each pixel in the image to be evaluated is compared with the texture energy threshold, the foreground region of the image to be evaluated is determined according to the comparison results, and then, according to the determined foreground region, the region of the image other than the foreground region is used as the background region of the image to be evaluated.
[0076] Optionally, "determining the foreground region of the image to be evaluated according to the texture energy value of each pixel in the image to be evaluated and the texture energy threshold" in S202 includes the following steps A and B:
[0077] A. Set the grayscale value of pixels whose texture energy value is greater than or equal to the texture energy threshold to a first grayscale value, and set the grayscale value of pixels whose texture energy value is less than the texture energy threshold to a second grayscale value, to obtain a binarized image of the image to be evaluated, where the first grayscale value is different from the second grayscale value.
[0078] Here, the pixels in the binarized image have only two grayscale values.
[0079] In one embodiment, the texture energy value of each pixel in the image to be evaluated is compared with the texture energy threshold; the grayscale value of pixels whose texture energy value is greater than or equal to the texture energy threshold is set to the first grayscale value, optionally "255", and the grayscale value of pixels whose texture energy value is less than the texture energy threshold is set to the second grayscale value, optionally "0", thereby generating the binarized image corresponding to the image to be evaluated.
[0080] B. Determine the foreground region of the image to be evaluated based on the binarized image of the image to be evaluated.
[0081] In one embodiment, the contour of the foreground region in the binarized image of the image to be evaluated is determined by an edge detection technique, where the edge detection techniques include, but are not limited to, the Sobel operator detection method, the Canny operator detection method and the Laplacian operator detection method. According to the relative position coordinates of the foreground region contour in the binarized image, the relative position coordinates of the contour in the image to be evaluated are determined, and the region composed of the pixels within the foreground region contour is used as the foreground region of the image to be evaluated. After the foreground region of the image to be evaluated is obtained, the region composed of the pixels of the image other than those in the foreground region is used as the background region of the image to be evaluated.
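The following sketch combines the binarization of step A with the contour-based foreground extraction of step B using OpenCV; the use of cv2.findContours and a simple largest-contour heuristic is an illustrative choice, not mandated by the disclosure.

```python
import cv2
import numpy as np

def foreground_background_masks(energy: np.ndarray, thres: float):
    """Binarize by the texture energy threshold, then take the region enclosed
    by the foreground contour as foreground and the rest as background."""
    # Step A: first grayscale value 255 where E >= Thres, second value 0 elsewhere.
    binary = np.where(energy >= thres, 255, 0).astype(np.uint8)

    # Step B: find the foreground contour in the binarized image.
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    foreground = np.zeros_like(binary)
    if contours:
        largest = max(contours, key=cv2.contourArea)   # illustrative heuristic
        cv2.drawContours(foreground, [largest], -1, 255, thickness=cv2.FILLED)
    background = cv2.bitwise_not(foreground)           # pixels outside the foreground contour
    return foreground, background
```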
[0082] Figure 2B is a schematic diagram of obtaining a background region according to an embodiment of the present disclosure. As shown in Figure 2B, 200 denotes the image to be evaluated, 201 denotes the binarized image of the image 200 to be evaluated, 202 denotes the foreground region in the binarized image 201, 203 denotes the background region in the image 200 to be evaluated, and 204 denotes the foreground region in the image 200 to be evaluated.
[0083] By setting the grayscale value of pixels whose texture energy value is greater than or equal to the texture energy threshold to the first grayscale value and setting the grayscale value of pixels whose texture energy value is less than the texture energy threshold to the second grayscale value, the binarized image of the image to be evaluated is obtained, and the foreground region of the image to be evaluated is determined according to the binarized image. Because the binarized image is determined from the texture energy values, the foreground region of the image to be evaluated can be determined more accurately, which indirectly makes the determined background region more accurate and further ensures that the determination result of the image quality of the image to be evaluated is also more accurate.
[0084] S203, match the color space value of any background pixel in the background region with the color space value interval corresponding to each color category, and use the color category to which the successfully matched color space value interval belongs as the color category of that background pixel.
[0085] In one embodiment, the HSV color space, which is convenient for acquiring color information, is selected as the color space in this embodiment. Ten common color categories as perceived by the human eye are set in advance, and each color category corresponds to an HSV color space value interval. Optionally, the HSV color space value intervals are defined as follows: black (0, 0, 0) ~ (180, 255, 46), gray (0, 0, 46) ~ (180, 43, 220), white (0, 0, 221) ~ (180, 30, 255), red (156, 43, 46) ~ (180, 255, 255) and (0, 43, 46) ~ (10, 255, 255), orange (11, 43, 46) ~ (25, 255, 255), yellow (26, 43, 46) ~ (34, 255, 255), green (35, 43, 46) ~ (77, 255, 255), cyan (78, 43, 46) ~ (99, 255, 255), blue (100, 43, 46) ~ (124, 255, 255) and purple (125, 43, 46) ~ (155, 255, 255).
[0086] For any background pixel in the background region, the HSV color space value of the background pixel is matched against the H value interval, the S value interval and the V value interval of the HSV color space value interval corresponding to each color category, and the color category of the HSV color space value interval whose H value, S value and V value intervals all match is used as the color category of that background pixel.
[0087] For example, suppose the HSV color space value of a background pixel is (5, 200, 30), that is, its H value is "5", its S value is "200" and its V value is "30". The HSV color space value interval corresponding to black is (0, 0, 0) ~ (180, 255, 46), that is, the H value interval is (0 ~ 180), the S value interval is (0 ~ 255) and the V value interval is (0 ~ 46). Since the HSV color space value of the background pixel falls within the interval (0, 0, 0) ~ (180, 255, 46), the color category of the background pixel is determined to be "black".
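A sketch of the HSV interval matching, using the intervals listed above (OpenCV-style H in 0-180, S and V in 0-255): a pixel is assigned to the first category whose H, S and V intervals all contain it, and red is represented by its two intervals.

```python
# (lower, upper) HSV bounds per color category, as listed in the text; red has two intervals.
HSV_INTERVALS = {
    "black":  [((0, 0, 0),     (180, 255, 46))],
    "gray":   [((0, 0, 46),    (180, 43, 220))],
    "white":  [((0, 0, 221),   (180, 30, 255))],
    "red":    [((156, 43, 46), (180, 255, 255)), ((0, 43, 46), (10, 255, 255))],
    "orange": [((11, 43, 46),  (25, 255, 255))],
    "yellow": [((26, 43, 46),  (34, 255, 255))],
    "green":  [((35, 43, 46),  (77, 255, 255))],
    "cyan":   [((78, 43, 46),  (99, 255, 255))],
    "blue":   [((100, 43, 46), (124, 255, 255))],
    "purple": [((125, 43, 46), (155, 255, 255))],
}

def hsv_color_category(hsv):
    """Return the color category whose H, S and V value intervals all contain `hsv`."""
    for category, intervals in HSV_INTERVALS.items():
        for lo, hi in intervals:
            if all(lo[k] <= hsv[k] <= hi[k] for k in range(3)):
                return category
    return None

# Example from the text: (5, 200, 30) falls within the black interval.
print(hsv_color_category((5, 200, 30)))  # black
```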
[0088] S204, determine the area of the foreground region of the image to be evaluated, and determine the image quality of the image to be evaluated according to the area of the foreground region and the number of background pixels belonging to each color category.
[0089] In one embodiment, the area of the foreground region of the image to be evaluated and the number of background pixels belonging to each color category in the background region are determined; the area of the foreground region is compared with an area threshold, and the number of background pixels belonging to each color category is compared with a quantity threshold. If the area of the foreground region is less than the area threshold and the number of background pixels of any color category is greater than the quantity threshold, the image quality of the image to be evaluated is determined to be abnormal.
[0090] Optionally, S204 includes the following two steps A and B:
[0091] A. Determine the area ratio between the area of the foreground region and the area of the image to be evaluated, and the quantity ratio between the number of background pixels belonging to each color category and the total number of pixels in the image to be evaluated.
[0092] For example, assume the area of the foreground region is 50 cm2 and the area of the image to be evaluated is 100 cm2; then the area ratio between the area of the foreground region and the area of the image to be evaluated is 50%. Assume the background pixels of the background region of the image have three color categories, "black", "white" and "gray", the number of black background pixels is 1000, the number of white background pixels is 500, the number of gray background pixels is 2000, and the total number of pixels in the image to be evaluated is 5000; then the quantity ratio between the number of black background pixels and the total number of pixels is 20%, the quantity ratio for white background pixels is 10%, and the quantity ratio for gray background pixels is 40%.
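A sketch of step A with the numbers from the example above; the function name and signature are illustrative, and areas can be given in pixels or any other consistent unit.

```python
def area_and_quantity_ratios(foreground_area, image_area, category_counts, total_pixels):
    """Return the foreground/image area ratio and the per-category quantity ratios."""
    area_ratio = foreground_area / image_area
    quantity_ratios = {cat: n / total_pixels for cat, n in category_counts.items()}
    return area_ratio, quantity_ratios

# Example from the text: 50 / 100 = 50%; black 20%, white 10%, gray 40%.
print(area_and_quantity_ratios(50, 100, {"black": 1000, "white": 500, "gray": 2000}, 5000))
```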
[0093] Optionally, it is also possible to determine the salient region corresponding to the foreground region, and to use the ratio between the area of the salient region and the area of the image to be evaluated as the area ratio.
[0094] Here, the salient region corresponding to the foreground region in the image to be evaluated can be detected by a saliency detection algorithm, including but not limited to a saliency detection algorithm based on color characteristics. Preferably, the bounding rectangle of the foreground region is extracted and used directly as the salient region corresponding to the foreground region.
[0095] B. Determine the image quality of the image to be evaluated according to the area ratio and the quantity ratios.
[0096] In one embodiment, the area ratio is compared with an area ratio threshold, each quantity ratio is compared with a quantity ratio threshold, and the image quality of the image to be evaluated is determined according to the two comparison results.
[0097] By determining the area ratio between the area of the foreground region and the area of the image to be evaluated and the quantity ratios between the number of background pixels of each color category and the total number of pixels in the image to be evaluated, and determining the image quality of the image to be evaluated according to the area ratio and the quantity ratios, the image quality is determined jointly from the area ratio associated with the foreground region and the quantity ratios associated with the background pixels, which improves the reliability and credibility of the final image quality determination result.
[0098] Optionally, step B includes:
[0099] when the area ratio is less than the area ratio threshold and the quantity ratio corresponding to any color category is greater than the quantity ratio threshold, determining that the image quality of the image to be evaluated is abnormal.
[0100] For example, assume the area ratio threshold is 30% and the quantity ratio threshold is 80%, and suppose the area ratio is 25% while the quantity ratio between the number of white background pixels in the background region and the total number of pixels in the image to be evaluated is 85%. Since the area ratio of 25% is less than the area ratio threshold of 30%, and the quantity ratio corresponding to white of 85% is greater than the quantity ratio threshold of 80%, the image quality of the image to be evaluated is determined to be abnormal.
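The decision rule of step B can be sketched as follows, with the example thresholds of 30% and 80% from the text as default values.

```python
def image_quality_abnormal(area_ratio, quantity_ratios,
                           area_ratio_threshold=0.30,
                           quantity_ratio_threshold=0.80):
    """Quality is abnormal when the area ratio is below its threshold and the
    quantity ratio of any color category exceeds its threshold."""
    return (area_ratio < area_ratio_threshold and
            any(r > quantity_ratio_threshold for r in quantity_ratios.values()))

# Example from the text: area ratio 25%, white quantity ratio 85% -> abnormal.
print(image_quality_abnormal(0.25, {"white": 0.85}))  # True
```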
[0101] By determining that the image quality of the image to be evaluated is abnormal when the area ratio is less than the area ratio threshold and the quantity ratio corresponding to any color category is greater than the quantity ratio threshold, a specific way of determining the image quality of the image to be evaluated is provided.
[0102] The present disclosure determines the texture energy threshold of the image to be evaluated according to the texture energy value of each pixel in the image, determines the foreground region of the image according to the texture energy values of the pixels and the texture energy threshold, and then determines the background region of the image according to the foreground region, so that the background region can be determined quickly based on the per-pixel texture energy values and the adaptive texture energy threshold, which reduces the workload required to determine the background region of the image to be evaluated and improves the efficiency of determining the background region. By matching the color space value of any background pixel in the background region with the color space value interval corresponding to each color category and using the color category of the successfully matched interval as the color category of that background pixel, the color category of a background pixel is determined from the color space value interval of each color category, avoiding the inaccuracy of determining the color category of a background pixel from a single color space value. By determining the area of the foreground region of the image to be evaluated and determining the image quality according to the area of the foreground region and the number of background pixels belonging to each color category, the image quality is determined jointly from the two dimensions of the foreground region area and the number of background pixels of each color category, which improves the reliability and credibility of the final image quality determination result.
[0103] The applicants also found during development that if the foreground region is truncated by the image border, the texture information of the affected pixels is easily lost, resulting in incorrect grayscale value assignment of some pixels on the boundaries of the binarized image.
[0104] To solve the above problem, on the basis of the above embodiment, the following steps A, B and C are performed before "determining the foreground region of the image to be evaluated based on the binarized image of the image to be evaluated":
[0105] A. Take the pixels of the binarized image whose grayscale value is the first grayscale value as first-type pixels, and take the pixels located between first-type pixels whose grayscale value is the second grayscale value as second-type pixels.
[0106] Here, the first grayscale value is optionally "255" and the second grayscale value is optionally "0".
[0107] For example, assume the pixel A with pixel coordinates (1, 100) is a first-type pixel, and the nearest first-type pixel to pixel A is pixel B with pixel coordinates (4, 100). Then the pixels located between pixel A and pixel B whose grayscale value is the second grayscale value, that is, the pixels with coordinates (2, 100) and (3, 100), are taken as second-type pixels.
[0108] B. For any second-type pixel, determine whether there is a pixel whose grayscale value is the first grayscale value in the set of pixels that contains the second-type pixel and is perpendicular to the boundary on which the second-type pixel is located.
[0109] In one embodiment, assume pixel C is a second-type pixel located on the upper or lower boundary of the binarized image. It is determined whether there is a pixel whose grayscale value is the first grayscale value in the set of pixels that contains pixel C and is perpendicular to the upper and lower boundaries, that is, the pixel column in which pixel C is located.
[0110] In another embodiment, assume pixel D is a second-type pixel located on the left or right boundary of the binarized image. It is determined whether there is a pixel whose grayscale value is the first grayscale value in the set of pixels that contains pixel D and is perpendicular to the left and right boundaries, that is, the pixel row in which pixel D is located.
[0111] C. If there is, set the grayscale value of the second-type pixel to the first grayscale value.
[0112] In one embodiment, assume pixel C is a second-type pixel located on the upper or lower boundary of the binarized image. If it is determined that there is a pixel whose grayscale value is the first grayscale value in the set of pixels that contains pixel C and is perpendicular to the upper and lower boundaries, that is, the pixel column in which pixel C is located, the grayscale value of pixel C is set to the first grayscale value.
[0113] In another embodiment, assume pixel D is a second-type pixel located on the left or right boundary of the binarized image. If it is determined that there is a pixel whose grayscale value is the first grayscale value in the set of pixels that contains pixel D and is perpendicular to the left and right boundaries, that is, the pixel row in which pixel D is located, the grayscale value of pixel D is set to the first grayscale value.
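A sketch of the boundary calibration of steps A to C: a zero-valued pixel lying between 255-valued pixels on the top or bottom boundary has its column checked, and on the left or right boundary its row checked; if a 255-valued pixel exists there, the boundary pixel is reset to 255. The row/column scan below is one straightforward realisation of this description, not the only possible one.

```python
import numpy as np

def calibrate_boundaries(binary: np.ndarray, first: int = 255, second: int = 0) -> np.ndarray:
    """Fix boundary pixels of the binarized image whose texture information was
    lost because the foreground is truncated by the image border."""
    out = binary.copy()
    h, w = out.shape

    def fix_line(values, perpendicular_has_first):
        # `values` is one boundary row or column (a view into `out`); a pixel
        # with the second grayscale value between first-type pixels is second-type.
        first_idx = [k for k, v in enumerate(values) if v == first]
        if len(first_idx) < 2:
            return
        for k in range(first_idx[0] + 1, first_idx[-1]):
            if values[k] == second and perpendicular_has_first(k):
                values[k] = first

    # Top and bottom boundaries: check the pixel's column for a first-type pixel.
    for row in (0, h - 1):
        fix_line(out[row, :], lambda j: (out[:, j] == first).any())
    # Left and right boundaries: check the pixel's row for a first-type pixel.
    for col in (0, w - 1):
        fix_line(out[:, col], lambda i: (out[i, :] == first).any())
    return out
```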
[0114] By taking the pixels whose grayscale value is the first grayscale value as first-type pixels and the pixels located between first-type pixels whose grayscale value is the second grayscale value as second-type pixels, determining whether there is a pixel with the first grayscale value in the set of pixels that contains the second-type pixel and is perpendicular to the boundary on which the second-type pixel is located, and, if there is, setting the grayscale value of the second-type pixel to the first grayscale value, the problem that lost pixel texture information leads to incorrect grayscale value assignment of some pixels on the boundaries of the binarized image is solved, thereby ensuring the accuracy of the subsequently determined background region.
[0115] Figure 3 is a schematic structural diagram of an image quality determining apparatus disclosed in the present disclosure, which can be applied to the case of determining the image quality of an image to be evaluated. The apparatus of this embodiment can be implemented in software and/or hardware and integrated on any electronic device with computing capability.
[0116] As shown in Figure 3, the image quality determining apparatus 30 of this embodiment can include a background region determining module 31, a color category determining module 32 and an image quality determining module 33, wherein:
[0117] the background region determining module 31 is configured to determine the background region of the image to be evaluated based on the texture energy value of each pixel in the image to be evaluated;
[0118] the color category determining module 32 is configured to determine the color category of each background pixel in the background region; and
[0119] the image quality determining module 33 is configured to determine the image quality of the image to be evaluated based on the number of background pixels belonging to each color category.
[0120] Optionally, the background region determining module 31 is specifically configured to:
[0121] determine the texture energy threshold of the image to be evaluated based on the texture energy value of each pixel in the image to be evaluated;
[0122] determine the foreground region of the image to be evaluated based on the texture energy value of each pixel in the image to be evaluated and the texture energy threshold; and
[0123] determine the background region of the image to be evaluated based on the foreground region.
[0124] Optionally, the background region determining module 31 is specifically configured to:
[0125] determine the texture energy threshold of the image to be evaluated according to the texture energy value of each pixel in the image to be evaluated, the number of pixels in the vertical direction of the image to be evaluated and the number of pixels in the horizontal direction of the image to be evaluated.
[0126] Optionally, the background region determining module 31 is specifically configured to:
[0127] determine the texture energy threshold of the image to be evaluated by the following formula:
[0128] Thres = (s / (h × w)) × Σ(i=1..h) Σ(j=1..w) E(i, j)
[0129] where Thres denotes the texture energy threshold of the image to be evaluated, s denotes a scaling coefficient, h denotes the number of pixels in the vertical direction of the image to be evaluated, w denotes the number of pixels in the horizontal direction of the image to be evaluated, and E(i, j) denotes the texture energy value of the pixel in row i and column j of the image to be evaluated.
[0130] Optionally, the background region determining module 31 is specifically configured to:
[0131] set the grayscale value of pixels whose texture energy value is greater than or equal to the texture energy threshold to a first grayscale value, and set the grayscale value of pixels whose texture energy value is less than the texture energy threshold to a second grayscale value, to obtain a binarized image of the image to be evaluated, where the first grayscale value is different from the second grayscale value; and
[0132] determine the foreground region of the image to be evaluated based on the binarized image of the image to be evaluated.
[0133] Optionally, the apparatus further includes a binarized image calibration module, which is specifically configured to:
[0134] take the pixels of the binarized image whose grayscale value is the first grayscale value as first-type pixels, and take the pixels located between first-type pixels whose grayscale value is the second grayscale value as second-type pixels;
[0135] determine, for any second-type pixel, whether there is a pixel whose grayscale value is the first grayscale value in the set of pixels that contains the second-type pixel and is perpendicular to the boundary on which the second-type pixel is located; and
[0136] if there is, set the grayscale value of the second-type pixel to the first grayscale value.
[0137] Optionally, the color category determining module 32 is specifically configured to:
[0138] match the color space value of any background pixel with the color space value interval corresponding to each color category, and use the color category to which the successfully matched color space value interval belongs as the color category of that background pixel.
[0139] Optionally, the image quality determining module 33 is specifically configured to:
[0140] determine the area of the foreground region of the image to be evaluated, and determine the image quality of the image to be evaluated according to the area of the foreground region and the number of background pixels belonging to each color category.
[0141] Optionally, the image quality determining module 33 is specifically configured to:
[0142] determine the area ratio between the area of the foreground region and the area of the image to be evaluated, and the quantity ratio between the number of background pixels belonging to each color category and the total number of pixels in the image to be evaluated; and
[0143] determine the image quality of the image to be evaluated according to the area ratio and the quantity ratios.
[0144] Optionally, the image quality determining module 33 is specifically configured to:
[0145] determine that the image quality of the image to be evaluated is abnormal when the area ratio is less than the area ratio threshold and the quantity ratio corresponding to any color category is greater than the quantity ratio threshold.
[0146] The image quality determining apparatus 30 disclosed in the present disclosure can perform the image quality determining methods disclosed in the present disclosure, and has the corresponding functional modules and beneficial effects for performing the methods. For content not described in detail in this embodiment, reference may be made to the description in any method embodiment of the present disclosure.
[0147]...
[0148] According to an embodiment of the present disclosure, the present disclosure also provides an electronic device, a readable storage medium, and a computer program product.
[0149] Figure 4 shows a schematic block diagram of an example electronic device 400 that can be used to implement the embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktop computers, workstations, personal digital assistants, servers, blade servers, mainframe computers and other suitable computers. Electronic devices can also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smart phones, wearable devices and other similar computing devices. The components shown herein, their connections and relationships, and their functions are merely examples and are not intended to limit the implementations of the present disclosure described herein.
[0150] As shown in Figure 4, the device 400 includes a computing unit 401, which can perform various appropriate actions and processing according to a computer program stored in a read-only memory (ROM) 402 or loaded from a storage unit 408 into a random access memory (RAM) 403. Various programs and data required for the operation of the device 400 can also be stored in the RAM 403. The computing unit 401, the ROM 402 and the RAM 403 are connected to each other through a bus 404. An input/output (I/O) interface 405 is also connected to the bus 404.
[0151] Multiple components in the device 400 are connected to the I/O interface 405, including: an input unit 406, such as a keyboard or a mouse; an output unit 407, such as various types of displays and speakers; a storage unit 408, such as a magnetic disk or an optical disk; and a communication unit 409, such as a network card, a modem or a wireless communication transceiver. The communication unit 409 allows the device 400 to exchange information/data with other devices through a computer network such as the Internet and/or various telecommunication networks.
[0152] The computing unit 401 can be a variety of general-purpose and/or special-purpose processing components with processing and computing capabilities. Some examples of the computing unit 401 include, but are not limited to, a central processing unit (CPU), a graphics processing unit (GPU), various dedicated artificial intelligence (AI) computing chips, various computing units that run machine learning model algorithms, a digital signal processor (DSP), and any suitable processor, controller, microcontroller, etc. The computing unit 401 performs the respective methods and processes described above, such as the method of determining image quality. For example, in some embodiments, the method of determining image quality can be implemented as a computer software program tangibly contained in a machine-readable medium, such as the storage unit 408. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 400 via the ROM 402 and/or the communication unit 409. When the computer program is loaded into the RAM 403 and executed by the computing unit 401, one or more steps of the method of determining image quality described above can be performed. Alternatively, in other embodiments, the computing unit 401 can be configured to perform the method of determining image quality by any other suitable means (for example, by means of firmware).
[0153] Various embodiments of the systems and techniques described above can be implemented in digital electronic circuit systems, integrated circuit systems, field programmable gate arrays (FPGA), application-specific integrated circuits (ASIC), application-specific standard products (ASSP), systems on chip (SoC), complex programmable logic devices (CPLD), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include implementation in one or more computer programs, which can be executed and/or interpreted on a programmable system including at least one programmable processor; the programmable processor can be a dedicated or general-purpose programmable processor, can receive data and instructions from a storage system, at least one input device and at least one output device, and can transmit data and instructions to the storage system, the at least one input device and the at least one output device.
[0154] The program code for implementing the methods of the present disclosure can be written in any combination of one or more programming languages. The program code can be provided to a processor or controller of a general-purpose computer, a dedicated computer or another programmable data processing device, so that when the program code is executed by the processor or controller, the functions/operations specified in the flowcharts and/or block diagrams are implemented. The program code can be executed entirely on the machine, partially on the machine, as a stand-alone software package partially on the machine and partially on a remote machine, or entirely on a remote machine or server.
[0155] In the context of the present disclosure, a machine-readable medium may be a tangible medium, which may contain or store a program for use by, or for use in combination with, an instruction execution system, device or apparatus. The machine-readable medium can be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium can include, but is not limited to, electronic, magnetic, optical, electromagnetic, infrared or semiconductor systems, devices or apparatuses, or any suitable combination of the above. More specific examples of the machine-readable storage medium include an electrical connection based on one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
[0156] To provide interaction with the user, the systems and techniques described herein can be implemented on a computer that has: a display device (for example, a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to the user; and a keyboard and pointing device (for example, a mouse or a trackball) through which the user can provide input to the computer. Other types of devices can also be used to provide interaction with the user; for example, the feedback provided to the user can be any form of sensory feedback (for example, visual feedback, auditory feedback or haptic feedback), and the input from the user can be received in any form (including acoustic input, voice input or tactile input).
[0157] The systems and techniques described herein can be implemented in a computing system that includes a back-end component (for example, as a data server), or a computing system that includes a middleware component (for example, an application server), or a computing system that includes a front-end component (for example, a user computer with a graphical user interface or a web browser through which the user can interact with the systems and techniques described herein), or a computing system that includes any combination of such back-end, middleware or front-end components. The components of the system can be connected to each other by digital data communication of any form or medium (for example, a communication network). Examples of the communication network include: a local area network (LAN), a wide area network (WAN), a blockchain network and the Internet.
[0158] A computer system can include a client and a server. The client and the server are generally remote from each other and usually interact through a communication network. The relationship between the client and the server is generated by computer programs running on the corresponding computers and having a client-server relationship with each other. The server can be a cloud server, also known as a cloud computing server or cloud host, which is a host product in the cloud computing service system and solves the defects of high management difficulty and weak business scalability in traditional physical hosts and VPS services.
[0159] It should be understood that steps can be reordered, added or deleted using the various forms of flows shown above. For example, the steps described in the present disclosure can be performed in parallel, sequentially or in a different order, as long as the desired result of the technical solution disclosed in the present disclosure can be achieved; this is not limited herein.
[0160] The above specific embodiments do not constitute a limitation on the protection scope of the present disclosure. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and replacements can be made according to design requirements and other factors. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present disclosure shall be included within the protection scope of the present disclosure.
