[0064] Embodiment One
[0065] Referring to Figure 1, an embodiment of the present invention provides a vehicle speed measurement method, including the following steps:
[0066] Step S101: The monitoring machine receives multiple frames of images of the vehicle to be tested, collected by the close-up camera at a preset frequency.
[0067] Because the time for a vehicle to pass through the field of view is very short, generally less than 1 second, in order to meet real-time requirements it is preferable that the close-up camera collects 25 frames of images per second, with the intervals between frames as equal as possible, so that the number of frames the vehicle spans can be used to estimate the elapsed time.
[0068] Step S102: The monitoring machine calculates the license plate running trajectory over the multiple frames of images, determines whether the trajectory is correct, and selects the multiple frames with a correct trajectory as the image group to be analyzed.
[0069] In the embodiment of the present invention, the position of the license plate is used to identify the position of the vehicle to be tested. Judging whether the running trajectory of the license plate is correct serves mainly to eliminate interference: if the trajectory is judged incorrect, the frames are treated as interference and are not analyzed.
[0070] To select the multiple frames with a correct license plate trajectory as the image group to be analyzed, an adaptive virtual coil technique is preferably adopted: two virtual coils are set adaptively according to the trajectory-correctness judgment, one identifying the range in which the vehicle enters and the other identifying the range in which the vehicle leaves. The frames whose license plate trajectory is correct between the two adaptively set virtual coils are then selected as the image group to be analyzed. In practical applications, the setting of the virtual coils can take into account not only the vehicle leaving but also the vehicle stopping.
[0071] In the embodiment of the present invention, whether the license plate running trajectory is correct can be determined by judging whether the license plate numbers are consistent, by judging whether the license plate running directions are consistent, or by judging whether the imaging distances of the license plate between images are similar, and so on. In practical applications, the above judgment methods can be combined to further improve the accuracy of vehicle speed measurement.
[0072] Specifically, the method of determining trajectory correctness by license plate number consistency is: recognize the license plate numbers in the multiple frames of images and check whether they are consistent; if they are all consistent, the frames are determined to be trajectory-correct images; if there are inconsistencies, select consecutive frames with the same license plate number as the trajectory-correct images.
[0073] The method of recognizing the license plate number in an image includes the following sub-steps:
[0074] Perform image smoothing on the image, including horizontal smoothing and vertical smoothing; preferably, a Gaussian smoothing algorithm can be used.
[0075] Perform license plate rotation correction on the smoothed image to correct license plate tilt or character tilt, including horizontal tilt correction and vertical tilt correction: estimate the tilt angle, then rotate by the acquired angle. Preferably, rotation correction can be implemented with bilinear interpolation.
[0076] Segment the rotation-corrected image by vertical projection and connected-component marking to extract the characters; vertical projection analyzes the arrangement of the characters, and connected-component marking extracts them.
[0077] Perform character recognition on the extracted characters: preferably, extract the stroke features of each character, train with a trainer, and finally use a voting mechanism to determine the character class, completing the recognition.
[0078] The method of judging whether license plate numbers are consistent is specifically: compare the contents of corresponding digits of the license plate numbers digit by digit and check whether they all satisfy a preset matching correspondence; if so, the license plate numbers are determined to be consistent. The matching correspondence records the relationships between similarly shaped letters and digits; for example, the letter B has a matching correspondence with the digit 8, and the digit 0 has a matching correspondence with the letters D and Q.
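As a minimal sketch (the function names and the generic group structure are assumptions; only the B~8 and 0~D~Q correspondences come from the text), the digit-by-digit comparison under a matching correspondence might look like:

```python
# Hypothetical sketch of the digit-by-digit plate comparison; only the
# B~8 and 0~D~Q correspondences are from the text, the rest is illustrative.
MATCH_GROUPS = [{"B", "8"}, {"0", "D", "Q"}]

def chars_match(a: str, b: str) -> bool:
    """Characters match if identical or listed as similar in shape."""
    return a == b or any(a in g and b in g for g in MATCH_GROUPS)

def plates_consistent(p1: str, p2: str) -> bool:
    """Compare corresponding digits one by one under the correspondence."""
    return len(p1) == len(p2) and all(chars_match(a, b) for a, b in zip(p1, p2))

print(plates_consistent("AB1234", "A81234"))  # True: B matches 8
print(plates_consistent("AB1234", "AC1234"))  # False
```

This tolerates single-character recognition ambiguity between frames without discarding an otherwise consistent trajectory.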
[0079] Specifically, the method of determining trajectory correctness by license plate running direction consistency is:
[0080] Use a license plate location algorithm to calculate the position of the license plate in each of the multiple frames, compare the positions frame by frame, and determine whether the running direction of the license plate is consistent across the frames. If consistent, the frames are determined to be trajectory-correct images; if there are inconsistencies, reselect consecutive frames with the same running direction as the trajectory-correct images. Preferably, the license plate location algorithm exploits the dense edges within the license plate, using the Sobel operator to obtain the edge positions of the plate.
[0081] Here, the license plate location algorithm obtains the position of the license plate in each frame to determine the position of the vehicle to be tested, and the recognition results of the preceding and following frames are combined to judge whether the running direction of the license plate is consistent.
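The frame-by-frame direction check can be sketched as follows (a simplified one-dimensional illustration; the function name and the use of a single vertical coordinate are assumptions):

```python
def directions_consistent(plate_ys):
    """plate_ys: vertical plate positions (pixels) in successive frames,
    as produced by a license plate location step. The running direction is
    consistent if the frame-to-frame displacement never changes sign."""
    diffs = [b - a for a, b in zip(plate_ys, plate_ys[1:])]
    return all(d > 0 for d in diffs) or all(d < 0 for d in diffs)

print(directions_consistent([10, 22, 35, 49]))  # True: always moving one way
print(directions_consistent([10, 22, 18, 49]))  # False: direction flips
```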
[0082] Specifically, the method of determining trajectory correctness by judging whether the imaging distances of the license plate are similar is:
[0083] Use the license plate location algorithm to calculate the imaging distance of the license plate between adjacent frames and judge whether these distances are similar. If they are all similar, the frames are determined to be trajectory-correct images; if some are not similar, reselect consecutive frames with similar imaging distances as the trajectory-correct images.
[0084] Step S103: The monitoring machine tracks and calculates the license plate imaging distance across the multiple frames in the image group to be analyzed, calculates the actual distance corresponding to the imaging distance according to the correspondence between the real-world coordinate system and the imaging coordinate system, and divides the actual distance by the time difference between the corresponding images, calculated from the preset frequency, to obtain the current tracking vehicle speed; the multiple tracking speeds are then averaged to obtain the speed of the vehicle to be tested.
[0085] There are many ways to track and calculate the license plate imaging distance across the frames in the image group to be analyzed. A preferred tracking algorithm is to track and calculate the license plate imaging distance from the first frame of the image group to the current tracking image.
[0086] Referring to Figure 2, the detailed steps of tracking and calculating the vehicle speed are shown:
[0087] Step S201: After it is judged that a vehicle has entered, the license plate is tracked starting from the first frame of the image group to be analyzed.
[0088] Step S202: Track and calculate the imaging distance of the license plate from the vehicle-entering image, that is, the first frame of the image group to be analyzed, to the current tracking image.
[0089] Step S203: Calculate the actual distance corresponding to the imaging distance according to the corresponding relationship between the real world coordinate system and the imaging coordinate system.
[0090] Step S204: Multiply the number of frames from the vehicle-entering image to the current tracking image by the inter-frame interval to obtain the elapsed time from the vehicle-entering image to the current tracking image.
[0091] Step S205: Divide the actual distance by the elapsed time to obtain the current tracking speed, and record it.
[0092] Step S206: Determine whether to trigger the result output; if yes, execute step S207; otherwise, update the current tracking image to the next frame and execute step S202 again.
[0093] Whether to trigger the result output can be determined by judging whether the current tracking image has entered the virtual coil used to identify the vehicle leaving.
[0094] Step S207: Average the recorded tracking speeds to obtain the speed of the vehicle to be tested. Since the time for a vehicle to pass is very short, less than 1 second, the method fully meets real-time requirements.
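Steps S202 to S207 can be sketched as follows, assuming the imaging-to-actual distance conversion of step S203 has already been applied (the function name and argument layout are illustrative):

```python
def track_speed(actual_distances, fps=25.0):
    """Sketch of steps S202-S207. actual_distances[i] is the real-world
    distance (m) from the vehicle-entering frame to tracking frame i+1,
    i.e. already converted from the imaging distance (step S203)."""
    interval = 1.0 / fps                               # S204: inter-frame time
    speeds = [d / (n * interval)                       # S205: distance / time
              for n, d in enumerate(actual_distances, start=1)]
    return sum(speeds) / len(speeds)                   # S207: average speed

# A vehicle covering ~0.4 m per frame at 25 fps travels 10 m/s (36 km/h).
print(round(track_speed([0.4, 0.8, 1.2, 1.6]), 6))  # 10.0
```

Averaging the per-frame tracking speeds smooths out localization noise in any single frame.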
[0095] Of course, the license plate imaging distance between adjacent frames can also be tracked, and so on; the embodiment of the present invention does not limit the preset tracking rule.
[0096] In the embodiment of the present invention, the close-up camera is set directly above the traveling route of the vehicle to be tested, and the optical axis of the close-up camera lies on that route, ensuring that the optical axis and the traveling trajectory of the vehicle are approximately in the same plane. There is therefore the following equivalence relationship: the angle between the line from the license plate position in the real world to the optical center of the close-up camera and the optical axis is equal to the angle between the line from the license plate imaging position in the imaging world to the optical center of the close-up camera lens and the optical axis. In addition, the figure formed between the tracked trajectory of the vehicle under test and the optical center of the close-up camera is an isosceles triangle, see Figure 3-2, where the optical center of the close-up camera is the vertex of the imaging triangle.
[0097] Figures 3-1 and 3-2 are combined below to introduce a preferred method for determining the correspondence between the real-world coordinate system and the imaging coordinate system:
[0098] Figure 3-1 shows the vehicle speed measurement scene in the real world, where the x-axis represents the direction of the vehicle; Figure 3-2 shows the imaging scene corresponding to the speed measurement scene in Figure 3-1. Combining Figures 3-1 and 3-2, a method for calculating the actual distance corresponding to the imaging distance according to the correspondence between the real-world coordinate system and the imaging coordinate system is introduced, specifically:
[0099] Three parameters are calibrated in Figure 3-1: h is the height of the close-up camera above the ground; v1 is the distance between the field-of-view boundary point V1 nearest the camera position and the vertical projection point of the close-up camera on the ground; and v2 is the distance between the field-of-view boundary point V2 farthest from the camera position and the vertical projection point of the close-up camera on the ground.
[0100] A is the position of the close-up camera, and OP is its optical axis, that is, OP is the angle bisector of ∠V1AV2; the corresponding angle big_angle is the maximum shooting angle of the close-up camera, and small_angle is its minimum shooting angle. Two test points D1 and D2 are chosen: d1 is the distance between test point D1 and the vertical projection point of the close-up camera, and d2 is the distance between test point D2 and that projection point; cur_angle1 is the close-up camera's shooting angle to test point D1, and cur_angle2 is its shooting angle to test point D2.
[0101] Figure 3-2 is a schematic diagram of the imaging world formed between the trajectory of the vehicle under test in the frames of the image group to be analyzed and the close-up camera, where P, V1, V2, D1, D2, and A correspond to the points of the real-world shooting scene shown in Figure 3-1. Since the optical axis of the close-up camera lies on the route of the vehicle under test, the three points V1, V2, and A in Figure 3-2 form an isosceles triangle, where AP is the optical axis in the imaging world, that is, P is the midpoint of V1V2. The imaging distance between V1 and V2 is H, and the imaging distance between V1 and D1 is c.
[0102] According to Figure 3-1, the maximum shooting angle big_angle of the close-up camera equals the arctangent of h/v1, that is, big_angle = atan(h/v1), and the minimum shooting angle small_angle equals the arctangent of h/v2, that is, small_angle = atan(h/v2).
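The two calibrated angles can be computed directly (the function name and the example values of h, v1, and v2 are illustrative):

```python
import math

def view_angles(h, v1, v2):
    """Calibrated parameters from Figure 3-1: h = camera height; v1/v2 =
    ground distances from the camera's vertical projection point O to the
    nearest/farthest field-of-view boundary points V1/V2."""
    big_angle = math.atan(h / v1)    # nearest boundary -> largest angle
    small_angle = math.atan(h / v2)  # farthest boundary -> smallest angle
    return big_angle, small_angle

# Example: a camera 6 m up covering the ground from 10 m to 40 m ahead.
big, small = view_angles(h=6.0, v1=10.0, v2=40.0)
print(round(math.degrees(big), 2), round(math.degrees(small), 2))
```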
[0103] In the vehicle speed measurement scene of the embodiment of the present invention, the close-up camera is directly above the driving route of the vehicle and its optical axis lies on that route, that is, the optical axis of the close-up camera and the driving trajectory of the vehicle under test are approximately in one plane, so the following equivalence relationship holds:
[0104] The angle between the line from the license plate position in the real world to the optical center of the close-up camera and the optical axis equals the angle between the line from the license plate imaging position in the imaging world to the optical center of the close-up camera lens and the optical axis. Using this equivalence, the distance OD1 in Figure 3-1 between the test point D1 and the vertical projection point of the close-up camera in the real world is calculated as follows:
[0105] According to Figure 3-2, the angle between the line from the license plate position D1 to the close-up camera A and the optical axis is tmp_angle1, so tan(tmp_angle1) = PD1/PA, and tan(∠PAV1) = PV1/PA.
[0106] Since PD1 = H/2 − c and PV1 = H/2, PD1/PV1 = (H−2×c)/H, which gives formula 1:
[0107] tan(tmp_angle1)/tan(∠PAV1)=(H-2×c)/H formula 1
[0108] In Figure 3-1, by the exterior angle theorem (an exterior angle of a triangle equals the sum of the two non-adjacent interior angles), the relation shown in formula 2 holds:
[0109] big_angle=∠V1AV2+small_angle formula 2
[0110] In addition, 2×∠PAV1=∠V1AV2 formula 3
[0111] Since ∠PAV1 in the real world shown in Figure 3-1 and ∠PAV1 in the imaging world shown in Figure 3-2 are equal, combining formulas 1, 2, and 3 gives:
[0112] tan(tmp_angle1)/tan(big_angle/2-small_angle/2)=(H-2×c)/H
[0113] That is, tmp_angle1=atan((H-2×c)/H×tan(big_angle/2-small_angle/2)) formula 4
[0114] In Figure 3-1, for test point D1, the shooting angle of the close-up camera is cur_angle1 = small_angle + ∠D1AV2.
[0115] Similarly, ∠D1AP in the real world shown in Figure 3-1 and tmp_angle1 in the imaging world shown in Figure 3-2 are equal, so the following can be derived:
[0116] cur_angle1=small_angle+∠D1AV2=small_angle+(tmp_angle1+∠PAV2)
[0117] =small_angle+(tmp_angle1+(∠V1AV2)/2)
[0118] =small_angle+(tmp_angle1+(big_angle-small_angle)/2)
[0119] =big_angle/2+small_angle/2+tmp_angle1 formula 5
[0120] Thus formula 5 yields cur_angle1 from the tmp_angle1 obtained by formula 4.
[0121] Therefore, combining with Figure 3-1, formula 6 can be used to calculate the distance between test point D1 and point O:
[0122] OD1=h/tan(cur_angle1) formula 6
[0123] For test point D2, based on the same method as test point D1, the distance OD2 between test point D2 and point O can be calculated.
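Formulas 4 to 6 can be combined into a single computation (a sketch; the function name and the example numbers are illustrative, and the units of H and c cancel in the ratio):

```python
import math

def ground_distance(h, v1, v2, H, c):
    """Formulas 4-6 combined: h, v1, v2 are the calibrated real-world
    parameters (Figure 3-1); H is the imaging distance V1V2 and c the
    imaging distance V1D1 measured in the image (Figure 3-2)."""
    big_angle = math.atan(h / v1)
    small_angle = math.atan(h / v2)
    half = (big_angle - small_angle) / 2.0
    tmp_angle1 = math.atan((H - 2.0 * c) / H * math.tan(half))     # formula 4
    cur_angle1 = big_angle / 2.0 + small_angle / 2.0 + tmp_angle1  # formula 5
    return h / math.tan(cur_angle1)                                # formula 6

# Sanity checks: c = 0 places D1 at V1 (so OD1 = v1); c = H places D1 at V2.
print(round(ground_distance(6.0, 10.0, 40.0, 480.0, 0.0), 6))    # 10.0
print(round(ground_distance(6.0, 10.0, 40.0, 480.0, 480.0), 6))  # 40.0
```

The endpoint checks confirm the derivation: at c = 0, tmp_angle1 reduces to (big_angle − small_angle)/2 and cur_angle1 to big_angle.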
[0124] The difference between the distances OD1 and OD2 from the vertical projection point of the close-up camera to the license plate in the real world is the actual distance between test points D1 and D2 in the real-world coordinate system. The imaging distance between test points D1 and D2 in the imaging coordinate system is calculated using steps S201 to S203 above; the ratio between the actual distance of D1 and D2 in the real-world coordinate system and their imaging distance in the imaging coordinate system then yields the correspondence between the real-world coordinate system and the imaging coordinate system.
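The resulting correspondence can be sketched as a simple proportional mapping (the helper name is hypothetical, and a single linear scale is a simplification of the described calibration):

```python
def imaging_to_actual(od1, od2, d1d2_imaging, tracked_imaging):
    """The real distance |OD1 - OD2| and the measured imaging distance
    between D1 and D2 give a proportional correspondence; an imaging
    distance obtained during tracking is mapped to a real-world distance
    by the same ratio."""
    scale = abs(od1 - od2) / d1d2_imaging   # metres per image unit
    return scale * tracked_imaging

# 4 m between the test points spans 100 px, so 250 px corresponds to 10 m.
print(round(imaging_to_actual(10.0, 6.0, 100.0, 250.0), 6))  # 10.0
```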
[0125] It can be seen that the method for determining the correspondence between the real-world coordinate system and the imaging coordinate system provided by the embodiment of the present invention requires only simple parameters: the height of the close-up camera and the distance from the current shooting position to the camera's vertical ground projection point, making calibration simple.
[0126] Further, the vehicle speed measurement method provided by the embodiment of the present invention can also identify the background color of the license plate from the image. The specific method is as follows:
[0127] Collect pixel data from the gaps between characters;
[0128] Extract the brightness information Y and the two color-difference components U and V from the data collected in the character gaps, and perform color analysis directly on the brightness and color-difference information to obtain the color value.
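An illustrative sketch of such a YUV analysis follows; the threshold values and color categories are assumptions (the text does not specify them), only the idea of classifying directly on Y/U/V samples from the character gaps comes from the source:

```python
def classify_plate_background(yuv_samples):
    """Illustrative only: average the (Y, U, V) samples taken from the
    gaps between characters and classify with simple chroma thresholds.
    The thresholds below are assumptions, not from the text."""
    n = len(yuv_samples)
    y = sum(s[0] for s in yuv_samples) / n
    u = sum(s[1] for s in yuv_samples) / n
    v = sum(s[2] for s in yuv_samples) / n
    if u > 140:                      # strong blue chroma
        return "blue"
    if v > 140:                      # strong red chroma
        return "yellow" if y > 100 else "red"
    return "white" if y > 128 else "black"

print(classify_plate_background([(80, 160, 110), (85, 158, 112)]))  # blue
```

Sampling between the characters avoids the character strokes themselves, so the averaged values reflect the plate background.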
[0129] It can be seen that the vehicle speed measurement method provided by the embodiment of the present invention realizes monocular visual speed measurement based on a close-up camera. Only one close-up camera is needed; no additional speed-detection equipment and no cooperation of multiple cameras are required, so installation is easy, the equipment is simple, and the applicable scenarios are more flexible and diverse. Moreover, by judging the correctness of the license plate running trajectory and using a tracking algorithm to calculate the speed, the method has strong robustness and high speed-measurement accuracy: according to the vehicle speed measurement method provided by the embodiment of the present invention, the accuracy of the measured speed is above 95%.
[0130] In addition, the speed measurement method is fast and accurate. It is not limited by ground induction coils, requires no digging of the road surface and no radar or other equipment, effectively overcomes errors caused by mutual interference between vehicles, and accurately obtains the license plate number while measuring the speed, providing an accurate basis for traffic law enforcement.