Image processing device and method and monitoring system

An image processing device and method, applied in the field of image processing, solve problems such as accurately determining foreground images, and achieve the effect of improving accuracy

Active Publication Date: 2017-11-28
CANON KK
Cites: 4 · Cited by: 1

AI-Extracted Technical Summary

Problems solved by technology

[0005] However, it is difficult to determine the exact reflectance of the foreground image via the Retinex theory, therefore, an approximate value e...

Abstract

The invention provides an image processing device, an image processing method, and a monitoring system. The image processing device comprises: a unit configured to obtain a current image and a background image; a unit configured to detect a foreground object from the current image according to the background image; a unit configured to determine a first parameter according to brightness information of the foreground object in the current image and brightness information of the part of the background image corresponding to the foreground object; and a unit configured to identify whether the foreground object is a pseudo foreground object according to the first parameter. The first parameter represents the degree of change in the brightness direction of the part corresponding to the foreground object from the background image to the current image. The device, method, and monitoring system can improve the precision of identifying pseudo foreground objects.

Application Domain

Image enhancement, Image analysis +2

Technology Topic

Brightness perception, Background image +2


Examples


Example Embodiment

[0033] For the first embodiment described above, the hardware structure 200 includes, for example, a CPU 210, a random access memory (RAM) 220, a storage device 230, an input device 240, an output device 250, an image acquisition device 260, a network interface 270, a system bus 280, and the image processing device 300/500/700. For the second embodiment described above, the hardware structure 200 does not include the image processing device 300/500/700. In this embodiment, the programs corresponding to the image processing of the present invention, described later with reference to FIG. 8 and FIG. 9, can be installed in the storage device 230 in advance; when the CPU 210 needs to execute a corresponding program, the installed program is loaded from the storage device 230 into the RAM 220.
[0034] The CPU 210 is any suitable programmable control device (such as a processor) and executes the various functions described below by running various application programs stored in the storage device 230 (for example, a read-only memory (ROM) and/or a hard disk). The RAM 220 temporarily stores programs or data loaded from the storage device 230, and also serves as the workspace in which the CPU 210 executes various programs, such as implementing the techniques described in detail below with reference to FIG. 8 and FIG. 9, as well as other available functions. The storage device 230 stores many kinds of information, such as an operating system (OS), various applications, control programs, images/videos acquired by the image acquisition device 260, and data pre-stored or pre-generated by the manufacturer; the data may be, for example, the threshold values (TH) described below.
[0035] The input device 240 enables the user to interact with the hardware structure 200. For example, the user can trigger the corresponding image processing of the present invention through the input device 240. The input device 240 may take various forms, such as a keypad or a touch screen. The output device 250 is an output interface; when the CPU 210 or the image processing device 300/500/700 recognizes a false foreground object in the current image/frame, the output device 250 transmits the foreground objects other than the false foreground objects to the monitoring device described below with reference to FIG. 10.
[0036] The image acquisition device 260 is used to acquire images/videos of a surveillance place, for example, and can serve as the optical system of a camera.
[0037] The image processing device 300/500/700 is used to identify whether the foreground object in the current image/frame is a false foreground object.
[0038] The network interface 270 provides an interface for connecting the hardware structure 200 to a network (such as the network 1030 shown in FIG. 10). For example, the hardware structure 200 performs data communication with other electronic devices (such as the monitoring device 1020 shown in FIG. 10). Alternatively, the hardware structure 200 may be equipped with a wireless interface for wireless data communication. The system bus 280 provides a data transmission path for transferring data among the CPU 210, the RAM 220, the storage device 230, the input device 240, the output device 250, the image acquisition device 260, the network interface 270, the image processing device 300/500/700, and so on. Although referred to as a bus, the system bus 280 is not limited to any specific data transmission technology.
[0039] The above-mentioned hardware structure 200 is merely illustrative and is by no means intended to limit the present invention or its application or use. For simplicity, only one hardware structure is shown in FIG. 2; however, multiple hardware structures can be used as needed.
[0040] (Structure of image processing device)
[0041] Next, referring to FIG. 3 to FIG. 9, the structure of the image processing apparatus used in the present invention will be described.
[0042] FIG. 3 is a block diagram illustrating the structure of the image processing apparatus 300 shown in FIG. 1 according to the first embodiment of the present invention. Some or all of the blocks shown in FIG. 3 are implemented by dedicated hardware.
[0043] As shown in FIG. 3, the image processing device 300 according to this embodiment of the present invention includes an acquisition unit 310, a foreground object detection unit 320, a parameter determination unit 330, and a foreground object recognition unit 340. In addition, the database 350 shown in FIG. 3 stores the images/videos acquired by the image acquisition device 260 shown in FIG. 2, as well as predefined thresholds, such as TH1 and TH2 described below. In one embodiment, the database 350 is the storage device 230 shown in FIG. 2.
[0044] First, the image acquisition device 260 acquires multiple images/videos and stores them in the storage device 230 (for example, the database 350). As mentioned above, a video can be regarded as a sequence of images, so the process for processing video is similar to that for processing images. Therefore, to keep the description simple, the image is used as the processing object below.
[0045] Then, for the current image acquired by the image acquisition device 260, the acquisition unit 310 acquires the current image from the image acquisition device 260 via the system bus 280. The acquisition unit 310 also acquires a background image from the database 350 via the system bus 280, where the background image is obtained from the images acquired by the image acquisition device 260 within a predefined duration before the current image. As described above, the acquisition unit 310 derives the background image from the images stored in the database 350 that were acquired within a certain duration before the current image; this duration is not limited and is set based on experimental statistics and/or experience. In this embodiment, the acquisition unit 310 uses the average image of the multiple images acquired within that duration as the background image.
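As an illustration of this background-modeling step, the following is a minimal sketch in Python, assuming grayscale frames represented as NumPy arrays; the function name build_background_image is hypothetical and not part of the patent.

    import numpy as np

    def build_background_image(previous_frames):
        # Average of the frames captured within the predefined duration
        # before the current image, as the acquisition unit 310 does in
        # this embodiment.
        stack = np.stack([np.asarray(f, dtype=np.float64) for f in previous_frames])
        return stack.mean(axis=0)  # per-pixel mean as the background image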
[0046] The foreground object detection unit 320 detects foreground objects from the current image according to the background image. As described above, by comparing the current image with the background image, moving objects or newly appeared objects in the current image are regarded as foreground objects. The foreground object detection unit 320 may detect the foreground objects via, for example, an existing background subtraction algorithm or an existing image frame difference algorithm. In this embodiment, the foreground object detection unit 320 detects the foreground objects from the current image via the background subtraction algorithm disclosed in patent US8305440.
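For orientation only, the sketch below shows a generic background-subtraction step in Python: a pixel is treated as foreground when its luminance differs from the background by more than a threshold. This is not the algorithm of US8305440; the function name and the threshold value are illustrative assumptions.

    import numpy as np

    def detect_foreground_mask(current, background, diff_threshold=25.0):
        # Mark a pixel as foreground when its luminance differs from the
        # background by more than diff_threshold (illustrative value).
        diff = np.abs(current.astype(np.float64) - background.astype(np.float64))
        return diff > diff_threshold  # boolean foreground mask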
[0047] After the foreground object detection unit 320 detects the foreground objects, it must be recognized whether a false foreground object exists among them, where a false foreground object is, for example, a shadow or glare caused by a change in ambient lighting (such as a light source being switched on or off). To this end, for each foreground object, the parameter determination unit 330 determines the above-mentioned luminance change ratio (i.e., the first parameter) according to the brightness information of the foreground object in the current image and the brightness information of the part of the background image corresponding to the foreground object; the luminance change ratio represents the degree of change in the brightness direction of the part corresponding to the foreground object from the background image to the current image.
[0048] More specifically, the parameter determination unit 330 determines the luminance change ratio according to the brightness values of the visual elements of the foreground object in the current image and the brightness values of the visual elements of the part of the background image corresponding to the foreground object. As described above, a visual element is a visible characteristic that contributes to the appearance of the current image and the background image; a visual element may be, for example, a pixel, a DCT block, or a superpixel. In one embodiment, in order to determine simply and quickly both the direction of the brightness change of the part corresponding to the foreground object from the background image to the current image (that is, brightening or darkening) and the degree of that change, the parameter determination unit 330 includes, as shown in FIG. 3, a number determination unit 331, a maximum number determination unit 332, and a ratio determination unit 333.
[0049] First, the number determination unit 331 determines the first number, namely the number of visual elements of the part corresponding to the foreground object whose brightness value in the current image is greater than the corresponding brightness value in the background image, and the second number, namely the number of visual elements of that part whose brightness value in the current image is smaller than the corresponding brightness value in the background image.
[0050] Specifically, taking the first number as an example: when one visual element corresponds to one pixel, the first number is the total number of pixels in the part corresponding to the foreground object whose brightness value in the current image is greater than the corresponding brightness value in the background image. In other words, First Number = Σ Number of pixels, where the pixels counted in the formula are those whose brightness value in the current image is greater than that in the background image.
[0051] When one visual element corresponds to one DCT block, where each DCT block includes the same number of pixels, the first number is the total number of pixels included in the corresponding DCT blocks of the part corresponding to the foreground object, where the corresponding DCT blocks are those whose brightness value in the current image is greater than the corresponding brightness value in the background image. In other words, First Number = (Number of pixels in one DCT block) × Σ Number of DCT blocks, where the DCT blocks counted in the formula are those whose brightness value in the current image is greater than the corresponding brightness value in the background image.
[0052] When one visual element corresponds to one superpixel, where different superpixels may include the same or different numbers of pixels, the first number is the total number of pixels included in the corresponding superpixels of the part corresponding to the foreground object, where the corresponding superpixels are those whose brightness value in the current image is greater than the corresponding brightness value in the background image. In other words, First Number = Σ_i Number of pixels in the i-th superpixel, where the superpixels counted in the formula are those whose brightness value in the current image is greater than the corresponding brightness value in the background image.
[0053] Then, the maximum number determination unit 332 determines the larger of the first number and the second number as the maximum number, where the maximum number reflects the change in the brightness direction of the part corresponding to the foreground object from the background image to the current image.
[0054] Generally speaking, for a change in ambient lighting caused by turning on a light source, the brightness values of most of the visual elements of the part corresponding to the foreground object increase in the current image compared with the background image; in this case, the first number is determined as the maximum number, and the change in the brightness direction is brightening. For a change in ambient lighting caused by turning off a light source, the brightness values of most of those visual elements decrease in the current image compared with the background image; in this case, the second number is determined as the maximum number, and the change in the brightness direction is darkening. For a change caused by a real object, however, the brightness values of some visual elements of the part corresponding to the foreground object increase while those of others decrease; in this situation, sometimes the first number is greater than the second number, and sometimes the second number is greater than the first number.
[0055] Then, in order to determine the degree of the change in the brightness direction, the ratio determination unit 333 determines the quotient of the maximum number and the total number of visual elements of the foreground object as the luminance change ratio. In other words, Luminance Change Ratio = Maximum Number / Total Number.
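A minimal Python sketch of paragraphs [0049] to [0055] follows, assuming the luminance of each visual element is given as a NumPy array and the foreground object is given as a boolean mask; the optional pixel_counts argument covers the DCT-block and superpixel cases, where each visual element contributes its own number of pixels. All names are illustrative.

    import numpy as np

    def luminance_change_ratio(current, background, mask, pixel_counts=None):
        # current, background: per-visual-element luminance arrays.
        # mask: boolean array selecting the visual elements of the part
        #       corresponding to one foreground object.
        # pixel_counts: per-element pixel counts (DCT blocks/superpixels);
        #       defaults to one pixel per visual element.
        cur = current[mask].astype(np.float64)
        bg = background[mask].astype(np.float64)
        if pixel_counts is None:
            counts = np.ones_like(cur)
        else:
            counts = pixel_counts[mask].astype(np.float64)

        first_number = counts[cur > bg].sum()   # elements that brightened
        second_number = counts[cur < bg].sum()  # elements that darkened
        total_number = first_number + second_number
        if total_number == 0:
            return 0.0, "unchanged"
        maximum_number = max(first_number, second_number)
        direction = "brighten" if first_number >= second_number else "darken"
        return maximum_number / total_number, direction

Note that this sketch ignores visual elements whose luminance is unchanged, which is consistent with the worked examples of FIG. 4A to FIG. 4C below, where the total number equals the sum of the first and second numbers.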
[0056] Take the exemplary sample of the current image shown in FIG. 4A as an example. The dashed border represents the current image, and each block marked "∨" or "∧" represents a visual element of the part corresponding to the foreground object; in this example, one visual element corresponds to one pixel. The mark "∨" indicates that the brightness value of the visual element in the current image is reduced compared with the background image, and the mark "∧" indicates that it is increased. As shown in FIG. 4A, the number determination unit 331 first determines that the first number (i.e., the number of visual elements marked "∧") is 14 and the second number (i.e., the number of visual elements marked "∨") is 22. The maximum number determination unit 332 then determines the second number as the maximum number; in other words, the change in the brightness direction of the foreground object from the background image to the current image is darkening. Finally, using the above formula, the ratio determination unit 333 determines the luminance change ratio to be 22/36 ≈ 0.61.
[0057] Take the exemplary sample of the current image shown in FIG. 4B as another example. The dashed border represents the current image, and each thick-edged block marked "∨" or "∧" represents a visual element of the part corresponding to the foreground object; in this example, one visual element corresponds to one DCT block, and one DCT block includes 4 pixels. As shown in FIG. 4B, the number determination unit 331 determines that the first number (i.e., the total number of pixels in the visual elements marked "∧") is 11×4 and the second number (i.e., the total number of pixels in the visual elements marked "∨") is 15×4. The maximum number determination unit 332 then determines the second number as the maximum number; in other words, the change in the brightness direction is darkening. Finally, using the above formula, the ratio determination unit 333 determines the luminance change ratio to be 60/104 ≈ 0.58.
[0058] Take the exemplary sample of the current image shown in FIG. 4C as another example. The dashed border represents the current image, and each irregularly shaped block marked "∨" or "∧" represents a visual element of the part corresponding to the foreground object; in this example, one visual element corresponds to one superpixel, and the number (such as "40") shown in a visual element indicates how many pixels it includes. As shown in FIG. 4C, the number determination unit 331 first determines that the first number (i.e., the total number of pixels in the visual elements marked "∧") is 40+25+25+20+20+25 = 155, and the second number (i.e., the total number of pixels in the visual elements marked "∨") is 10+25+30 = 65. The maximum number determination unit 332 then determines the first number as the maximum number; in other words, the change in the brightness direction is brightening. Finally, using the above formula, the ratio determination unit 333 determines the luminance change ratio to be 155/220 ≈ 0.70.
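The three worked examples above can be checked with a few lines of Python; the counts are taken directly from the text of paragraphs [0056] to [0058].

    for first, second, label in [
            (14, 22, "FIG. 4A (pixels)"),
            (11 * 4, 15 * 4, "FIG. 4B (DCT blocks of 4 pixels)"),
            (155, 65, "FIG. 4C (superpixels)")]:
        ratio = max(first, second) / (first + second)
        print(f"{label}: luminance change ratio = {ratio:.2f}")
    # Prints 0.61, 0.58, and 0.70, matching the text.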
[0059] Returning now to FIG. 3: after the parameter determination unit 330 determines the first parameter (i.e., the luminance change ratio), for each foreground object the foreground object recognition unit 340 recognizes whether the foreground object is a false foreground object according to the first parameter. As described above, when a foreground object is caused by a light source, the luminance change ratio (i.e., the first parameter) is generally large. Therefore, the manufacturer can first define a threshold (i.e., the above-mentioned TH1) based on experimental statistics, machine learning, and/or experience, and store it in the database 350 shown in FIG. 3. The foreground object recognition unit 340 then obtains TH1 from the database 350 via the system bus 280 shown in FIG. 2, and when the luminance change ratio is greater than or equal to TH1, the foreground object is recognized as a false foreground object.
[0060] Finally, the foreground object recognition unit 340 transfers the foreground objects in the current image, except the recognized false foreground objects, to the output device 250 shown in FIG. 2 via the system bus 280.
[0061] As described above, in order to eliminate false recognition in some situations, the above-mentioned luminance change ratio and luminance variance difference can be used together to identify false foreground objects. FIG. 5 is a block diagram illustrating the structure of the image processing apparatus 500 shown in FIG. 1 according to the second embodiment of the present invention. Some or all of the blocks shown in FIG. 5 are implemented by dedicated hardware.
[0062] Comparing FIG. 5 with FIG. 3, the main difference of the image processing device 500 shown in FIG. 5 is that its parameter determination unit 330 further determines the above-mentioned luminance variance difference (i.e., the second parameter) based on the brightness information of the foreground object in the current image and the brightness information of the part of the background image corresponding to the foreground object; the luminance variance difference represents the difference in the uniformity of the surface pattern between the foreground object in the current image and the part of the background image corresponding to the foreground object.
[0063] More specifically, the parameter determination unit 330 determines the luminance variance difference according to the variance of the brightness values of the visual elements of the foreground object in the current image and the variance of the brightness values of the visual elements of the part of the background image corresponding to the foreground object. In one embodiment, in order to quickly determine whether the surface patterns of the two are uniform, the parameter determination unit 330 further includes, as shown in FIG. 5, a variance determination unit 334 and a difference determination unit 335.
[0064] First, the variance determination unit 334 determines the first mean variance of the brightness values of the visual elements of the part corresponding to the foreground object in the current image, and the second mean variance of the brightness values of the visual elements of the part corresponding to the foreground object in the background image. In one example, each mean variance can be calculated by the following formulas:
[0065] Luminance Variance_i = (Luminance_i − Mean Luminance)²
[0066] Mean Variance = (1/K) × Σ_{i=1..K} Luminance Variance_i
[0067] Among them, "i" denotes the i-th visual element of the part corresponding to the foreground object, "K" is the total number of visual elements of that part, "Luminance_i" is the brightness value of the i-th visual element, "Mean Luminance" is the average brightness value of all the visual elements, and "Luminance Variance_i" is the variance of the brightness value of the i-th visual element.
[0068] "Mean Variance" is the mean variance over all the visual elements.
[0069] Then, the difference determination unit 335 determines the difference between the first mean variance and the second mean variance as the luminance variance difference. In one example, the absolute difference is used: Luminance Variance Difference = |First Mean Variance − Second Mean Variance|. In another example, a quotient is used instead: Luminance Variance Difference = First Mean Variance / Second Mean Variance, or Luminance Variance Difference = Second Mean Variance / First Mean Variance.
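A minimal Python sketch of paragraphs [0064] to [0069] follows, assuming per-visual-element luminance arrays for the foreground object in the current image and for the corresponding part of the background image; the function names are illustrative.

    import numpy as np

    def mean_variance(luminances):
        # Mean variance per [0064]-[0068]: the average over all visual
        # elements of the squared deviation from the mean luminance.
        lum = np.asarray(luminances, dtype=np.float64)
        return np.mean((lum - lum.mean()) ** 2)

    def luminance_variance_difference(current_lum, background_lum, absolute=True):
        # Second parameter per [0069]: absolute difference of the two mean
        # variances, or (in the alternative example) their quotient.
        first = mean_variance(current_lum)      # foreground object, current image
        second = mean_variance(background_lum)  # corresponding part, background
        if absolute:
            return abs(first - second)
        return first / second if second != 0 else float("inf")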
[0070] As shown in FIG. 5, after the parameter determination unit 330 determines the two parameters (namely, the luminance change ratio and the luminance variance difference), for each foreground object the foreground object recognition unit 340 identifies whether the foreground object is a false foreground object according to both parameters. As described above, when a foreground object is caused by a light source, the luminance change ratio (i.e., the first parameter) is generally large and the luminance variance difference (i.e., the second parameter) is generally small. Therefore, the manufacturer can first define two thresholds (i.e., the above-mentioned TH1 and TH2) based on experimental statistics, machine learning, and/or experience, and store them in the database 350 shown in FIG. 5. The foreground object recognition unit 340 then obtains the two thresholds from the database 350 via the system bus 280 shown in FIG. 2, and when the luminance change ratio is greater than or equal to TH1 and the luminance variance difference is less than TH2, the foreground object is recognized as a false foreground object.
[0071] FIG. 6 shows the detailed process by which the foreground object recognition unit 340 shown in FIG. 5 recognizes whether a foreground object detected by the foreground object detection unit 320 is a false foreground object. FIG. 6 is a flowchart 600 schematically showing this detailed processing of the foreground object recognition unit 340 of the present invention.
[0072] As shown in FIG. 6, first, in step S610, since the luminance change ratio is generally small when the foreground object is a real object, the foreground object recognition unit 340 determines whether the luminance change ratio is greater than or equal to TH1. If so, the process goes to step S620; otherwise, the foreground object recognition unit 340 determines that the luminance change ratio is relatively small, and the process goes to step S640.
[0073] In step S620, since the luminance variance difference is generally small when the foreground object is caused by a light source, the foreground object recognition unit 340 determines whether the luminance variance difference is less than TH2. If the luminance variance difference is less than TH2, then in step S630 the foreground object recognition unit 340 recognizes the foreground object as a false foreground object caused by, for example, a light source being switched on or off; otherwise, the process goes to step S640.
[0074] In step S640, the foreground object recognition unit 340 recognizes the foreground object as a true foreground object.
[0075] In the process shown in FIG. 6, the foreground object recognition unit 340 first uses the luminance change ratio and then the luminance variance difference to recognize whether the foreground object is a false foreground object; a sketch of this decision logic follows this paragraph. Alternatively, the foreground object recognition unit 340 may use the luminance variance difference first and then the luminance change ratio: it first determines whether the luminance variance difference is less than TH2; if so, it then determines whether the luminance change ratio is greater than or equal to TH1; and if the luminance change ratio is greater than or equal to TH1, it recognizes the foreground object as a false foreground object.
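The decision logic of flowchart 600 can be summarized in a few lines of Python; TH1 and TH2 are the predefined thresholds from the database 350, and the function name is illustrative.

    def is_false_foreground(change_ratio, variance_difference, th1, th2):
        # S610: a real object generally has a small luminance change ratio.
        if change_ratio >= th1:
            # S620: a light-source-caused object generally has a small
            # luminance variance difference.
            if variance_difference < th2:
                return True   # S630: false foreground object (shadow/glare)
        return False          # S640: true foreground object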
[0076] In addition, since the acquisition unit 310 and the foreground object detection unit 320 shown in FIG. 5 are the same as those shown in FIG. 3, their detailed description is not repeated here.
[0077] As shown in FIG. 5, the above two parameters (i.e., the luminance change ratio and the luminance variance difference) are determined by the same unit (i.e., the parameter determination unit 330). Alternatively, as those skilled in the art will understand, the two parameters may be determined by two different units. For example, FIG. 7 is a block diagram illustrating the structure of the image processing apparatus 700 shown in FIG. 1 according to the third embodiment of the present invention. As shown in FIG. 7, the first parameter (i.e., the luminance change ratio) is determined by a first parameter determination unit 710, and the second parameter (i.e., the luminance variance difference) is determined by a second parameter determination unit 720. The first parameter determination unit 710 may further include the number determination unit 331, the maximum number determination unit 332, and the ratio determination unit 333 described in detail with reference to FIG. 3, and the second parameter determination unit 720 may further include the variance determination unit 334 and the difference determination unit 335 described in detail with reference to FIG. 5. In addition, since the acquisition unit 310, the foreground object detection unit 320, and the foreground object recognition unit 340 shown in FIG. 7 are the same as those shown in FIG. 5, their detailed description is not repeated here.
[0078] As described above, the luminance change ratio and the luminance variance difference are obtained directly from the current image and the background image, and these two parameters have actual values rather than approximate values. Therefore, by using these precise parameters, the recognition accuracy of the present invention is improved.
[0079] (Image processing process)
[0080] As mentioned above, the processing performed by the structures of the embodiments shown in FIG. 3, FIG. 5, and FIG. 7 may also be implemented in software and executed by the CPU 210 shown in FIG. 2. Below, the overall image processing is described with reference to FIG. 8 and FIG. 9. FIG. 8 is a flowchart 800 schematically showing the procedure of the overall image processing according to the first embodiment of the present invention, and FIG. 9 is a flowchart 900 schematically showing the procedure of the overall image processing according to the second embodiment of the present invention.
[0081] As shown in FIG. 8, first, the image acquisition device 260 shown in FIG. 2 acquires multiple images/videos and stores them in the storage device 230. As mentioned above, a video can be regarded as a sequence of images, so the process for processing video is similar to that for processing images; therefore, to keep the description concise, the image is used as the processing object below.
[0082] Then, for the current image acquired by the image acquisition device 260, in acquisition step S810 the CPU 210 acquires the current image from the image acquisition device 260 via the system bus 280. The CPU 210 also obtains a background image from the storage device 230 via the system bus 280, where the background image is obtained from the images acquired by the image acquisition device 260 within a predefined duration before the current image. In this embodiment, the CPU 210 uses the average image of the multiple images acquired within that duration as the background image.
[0083] In foreground object detection step S820, the CPU 210 detects the foreground objects from the current image based on the background image. In this embodiment, the CPU 210 detects the foreground objects via the background subtraction algorithm disclosed in patent US8305440.
[0084] After the CPU 210 detects the foreground objects, in order to identify whether a false foreground object exists among them, for each foreground object, in parameter determination step S830 the CPU 210 determines the first parameter (i.e., the luminance change ratio) according to the brightness values of the visual elements of the foreground object in the current image and the brightness values of the visual elements of the part of the background image corresponding to the foreground object. The first parameter represents the degree of change in the brightness direction of the part corresponding to the foreground object from the background image to the current image; for the detailed processing of this step, refer to the above description of the parameter determination unit 330 shown in FIG. 3.
[0085] After the CPU 210 determines the first parameter, for each foreground object, in foreground object recognition step S840 the CPU 210 recognizes whether the foreground object is a false foreground object according to the first parameter. Specifically, the CPU 210 obtains the threshold value (i.e., TH1) from the storage device 230 via the system bus 280 shown in FIG. 2, and if the first parameter is greater than or equal to TH1, the foreground object is recognized as a false foreground object.
[0086] Finally, the CPU 210 transfers the foreground objects in the current image, except the identified false foreground objects, to the output device 250 shown in FIG. 2.
[0087] Comparing FIG. 9 with FIG. 8, the main difference is that the process shown in FIG. 9 further includes step S910.
[0088] As shown in FIG. 9, after the CPU 210 determines the first parameter, in step S910 the CPU 210 determines the second parameter (i.e., the luminance variance difference) from the variance of the brightness values of the visual elements of the foreground object in the current image and the variance of the brightness values of the visual elements of the part of the background image corresponding to the foreground object. The second parameter represents the difference in the uniformity of the surface pattern between the foreground object in the current image and the part of the background image corresponding to the foreground object; for the detailed processing of this step, refer to the above description of the parameter determination unit 330 shown in FIG. 5.
[0089] After the CPU 210 determines the two parameters (namely, the luminance change ratio and the luminance variance difference), for each foreground object, in foreground object recognition step S920 the CPU 210 identifies whether the foreground object is a false foreground object according to both parameters; for the detailed processing of this step, refer to the flowchart 600 shown in FIG. 6 above.
(Monitoring system)
[0091] As mentioned above, when the CPU 210 or the image processing device 300/500/700 shown in FIG. 2 recognizes a false foreground object in the current image/frame, the output device 250 shown in FIG. 2 transmits the foreground objects other than the false foreground objects to the monitoring device. In other words, when the hardware structure 200 (i.e., the camera) shown in FIG. 2 recognizes a false foreground object in the current image/frame, the camera transmits the foreground objects other than the false foreground objects to the monitoring device. Therefore, as an exemplary application of the above-mentioned image processing, an exemplary monitoring system is described with reference to FIG. 10. FIG. 10 illustrates the arrangement of an exemplary monitoring system 1000 according to the present invention.
[0092] As shown in FIG. 10, the surveillance system 1000 according to the present invention includes a camera 1011 for monitoring surveillance place 1, a camera 1012 for monitoring surveillance place 2, a camera 1013 for monitoring surveillance place 3, and a monitoring device 1020, where the cameras 1011, 1012, and 1013 and the monitoring device 1020 are connected to each other via a network 1030. The network 1030 provides a data transmission path for transmitting data between the cameras 1011 to 1013 and the monitoring device 1020; alternatively, a system bus (not shown) can be used instead of the network 1030. In the surveillance system 1000, for example, the cameras 1011 and 1012 have the same hardware structure 200 as shown in FIG. 2, and the camera 1013 is a camera having the functions disclosed in the aforementioned patent application US2014/0003720. As an example, the process between the camera 1011 and the monitoring device 1020 is described in detail below; the processes between the cameras 1012/1013 and the monitoring device 1020 are similar.
[0093] As described above, first, the camera 1011 (for example, via the image acquisition device 260 shown in FIG. 2) continuously captures images/videos of surveillance place 1 and stores them in its own storage device (for example, the storage device 230 shown in FIG. 2).
[0094] Second, the camera 1011 (for example, via the image processing device 300/500/700 shown in FIG. 2) identifies, based on the processing described above with reference to FIG. 3 to FIG. 9, whether the foreground objects detected from the current image/frame are false foreground objects.
[0095] Third, when the camera 1011 recognizes some foreground objects as false foreground objects, the camera 1011 transmits the foreground objects excluding the recognized false foreground objects to the monitoring device 1020 via the network 1030. Finally, the monitoring device 1020 determines whether to provide an alarm (for example, play an alarm sound) based on the received foreground objects and a predefined alarm rule. When the monitoring device 1020 provides an alarm, the user/monitor executes the corresponding subsequent processing according to the alarm.
[0096] Take the surveillance processing of an illegal parking area (shown in FIG. 11) as an example, in which the camera 1011 continuously captures images/videos of the illegal parking area, and the predefined alarm rule is to provide an alarm when a car or other object is parked in the illegal parking area 1110. FIG. 11 schematically shows an exemplary current image 1100 of the illegal parking area 1110 taken by the camera 1011, where 1120 represents a car and 1130 represents a shadow produced by an airplane in the sky. According to the present invention, the car 1120 and the shadow 1130 are both detected as foreground objects from the current image of the illegal parking area 1110, and based on the above two parameters (i.e., the luminance change ratio and the luminance variance difference), the shadow 1130 is recognized as a false foreground object. Therefore, the camera 1011 transmits only the remaining foreground object (i.e., the car 1120) to the monitoring device 1020 via the network 1030, and since no object is parked in the illegal parking area 1110, the monitoring device 1020 does not provide an alarm to the monitoring person.
[0097] As described above, by using the present invention, the accuracy of identifying false foreground objects can be improved. Therefore, by using the present invention, the monitoring accuracy of the monitoring system 1000 can also be improved.
