Electronic image stabilization method based on characteristic straight line of ship-borne camera system

A characteristic-straight-line and camera-system technology applied in the field of image processing. It addresses the problems that high matching accuracy cannot be ensured, real-time image stabilization is difficult to guarantee, and parameter-solving accuracy is low, while achieving reduced feature selection, good real-time performance, and low computational complexity.

Inactive Publication Date: 2011-04-06
XIDIAN UNIV
Cites: 1 · Cited by: 18

AI-Extracted Technical Summary

Problems solved by technology

This method uses vector block segmentation to address the real-time limitation of the optical-flow method, which ensures a degree of real-time performance, but several shortcomings remain. First, the algorithm relies on feature-point matching: when few feature points are selected, the computational load is reduced, but high matching accuracy cannot be guaranteed, resulting in low parameter-solving accuracy; when m...

Method used

In video images captured during navigation, the sea and the sky occupy most of the frame, and their gray levels differ markedly. Image binarization is used to strengthen this gray-level difference, which requires finding a critical value to serve as the binarization threshold.
The dividing line between sea and sky in the video image is exactly the sea-sky line, and the binary images of the reference frame and the current frame obtained in step 2 make the sea-sky line appear more clearly. In the bina...

Abstract

The invention discloses an electronic image stabilization method based on the characteristic straight line of a ship-borne camera system. It belongs to the technical field of image processing and solves the instability of video images from a ship-borne camera system while the ship is sailing. The method comprises the following steps: (1) select a reference frame and read in the current frame; (2) perform image binarization; (3) apply the Sobel operator to detect edges in the binary images; (4) use the Hough transform to extract the discontinuous sea-sky line and fit it with the least-squares method to obtain a continuous characteristic straight line; (5) horizontally project the characteristic straight line; (6) solve the line equation; (7) calculate the offset Δy; (8) calculate the deflection angle Δθ; (9) translate the current frame vertically by Δy and rotate it by Δθ to realize image stabilization. The method fully exploits the fact that the sea-sky line is always present during sailing, thereby avoiding excessive feature selection and feature matching, and offers good real-time performance, easy implementation, and low computational complexity.

Application Domain

Technology Topic

Horizontal projection · Least squares (+12 more)


Examples

  • Experimental program(1)

Example Embodiment

[0026] The present invention will be further described below in conjunction with the drawings.
[0027] Step 1. Select the reference frame and read in the current frame.
[0028] In order to eliminate the image shift and rotation caused by jitter of the camera system and keep each frame stable, a stable image is selected from the video being shot by the camera system and treated as the reference frame for the translation and rotation performed during image stabilization. The video image sequence being shot is then read into the image-stabilization system frame by frame, each frame serving as the current frame to be stabilized.
[0029] Step 2. Image binarization.
[0030] In video images captured during navigation, the sea and the sky occupy most of the frame, and their gray levels differ markedly. Image binarization is used to enhance the gray-level difference between sea and sky, which requires finding a critical value to serve as the binarization threshold.
[0031] 2a) Find the binarization threshold. The image-stabilization system sums all pixel values in each frame and divides the sum by the total number of pixels; the resulting average gray value of the image is that frame's binarization threshold.
[0032] 2b) Binarization.
[0033] Reference-frame binarization: compare the gray value of each pixel of the reference frame selected in step 1 with the binarization threshold obtained in step 2a). If the pixel value is greater than or equal to the threshold, set it to 1; otherwise set it to 0. The comparison results for all pixels, arranged in the pixel order of the reference frame, constitute the required reference-frame binary image.
[0034] Current-frame binarization: binarize the current frame read in step 1 by the same procedure used for the reference frame to obtain the required current-frame binary image.
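The mean-threshold binarization of steps 2a) and 2b) can be sketched in NumPy as follows (a minimal illustration, not the patented implementation; the function name is ours):

```python
import numpy as np

def binarize(frame):
    """Binarize a grayscale frame (step 2).

    2a) The threshold is the frame's average gray value.
    2b) Pixels >= threshold become 1 (sky), the rest 0 (sea).
    """
    threshold = frame.mean()
    return (frame >= threshold).astype(np.uint8)
```

The same function is applied to both the reference frame and each current frame.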
[0035] Step 3. Edge detection.
[0036] In the video image, the dividing line between sea and sky is the sea-sky line. The binary images of the reference frame and the current frame obtained in step 2 make the sea-sky line appear more clearly: all sea pixels have value 0 (black) and all sky pixels have value 1 (white), so the boundary between them is very distinct and the sea-sky line is easy to extract. To extract it, the image-stabilization system applies the Sobel operator, commonly used in image processing, to the binary images of the reference frame and the current frame from step 2, obtaining all edge contours, including the sea-sky line, in each frame.
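As a sketch, Sobel edge detection on the binary image can be written as a plain convolution loop in NumPy (the function name and the edge threshold are our choices, not the patent's):

```python
import numpy as np

def sobel_edges(bw, thresh=1.0):
    """Sobel edge detection on a binary image (step 3). Returns a 0/1 edge map."""
    gx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    gy = gx.T
    h, w = bw.shape
    mag = np.zeros((h, w))
    padded = np.pad(bw.astype(float), 1)  # zero-pad so the output keeps the input size
    for i in range(h):
        for j in range(w):
            win = padded[i:i + 3, j:j + 3]
            # gradient magnitude from the horizontal and vertical Sobel responses
            mag[i, j] = np.hypot((win * gx).sum(), (win * gy).sum())
    return (mag >= thresh).astype(np.uint8)
```

On a binary sea/sky image, the only strong responses lie along the 0/1 boundary, so the edge map is dominated by the sea-sky line contour.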
[0037] Step 4. Extract the discontinuous sea-sky line.
[0038] In the edge-contour image obtained in step 3, the edges include not only the sea-sky line but also many contours of other natural scenery or ships, most of which are curves or useless short segments. The image-stabilization system therefore uses the Hough transform, a method commonly used in image processing to extract straight lines, to extract the discontinuous sea-sky line.
[0039] Step 5. Fit the sea-sky lines of the reference frame and the current frame.
[0040] Since the sea-sky line extracted in step 4 is discontinuous, its line equation cannot be obtained directly. The image-stabilization system therefore uses the least-squares method, a common mathematical tool, to fit all coordinate points on the discontinuous sea-sky line into a continuous characteristic straight line, and outputs the fitted sea-sky lines of the reference frame and the current frame for the subsequent solution of the line-equation parameters.
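For a nearly horizontal sea-sky line, the least-squares fit of step 5 reduces to fitting y = k·x + b over the points of the discontinuous segments (a sketch; NumPy's `polyfit` stands in for the patent's normal-equation solution, and the function name is ours):

```python
import numpy as np

def fit_horizon(xs, ys):
    """Least-squares fit (step 5): fit the discontinuous sea-sky-line points
    (xs, ys) to the continuous characteristic line y = k*x + b."""
    k, b = np.polyfit(xs, ys, 1)   # degree-1 least-squares polynomial
    return k, b
```

Gaps in the abscissas are harmless: the fit uses whatever points the Hough extraction kept.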
[0041] Step 6. Horizontally project the sea-sky lines of the reference frame and the current frame.
[0042] Since only the vertical offset of the sea-sky line is needed here, the sea-sky lines of the reference frame and the current frame output in step 5 are projected horizontally. Because the fitted sea-sky lines of the two frames differ in length, only part of the sea-sky line is selected for projection.
[0043] Select the points of the reference frame with the same abscissas for horizontal projection, sum the ordinates of all projected points on the coordinate axis, and divide the sum by the total number of projected points to obtain the average ordinate of the reference-frame sea-sky line. The average ordinate of the current-frame sea-sky line is obtained by the same procedure.
[0044] Step 7. Solve the line equations.
[0045] There is a certain angular difference between the sea-sky lines of the reference frame and the current frame output in step 5. This difference can be obtained from the correspondence between the deflection angle and the characteristic-line parameters. Take 3 points on each line; assuming their coordinates are (x1, y1), (x2, y2) and (x3, y3), substitute them into the line equation ax + by + c = 0 and solve the resulting simultaneous equations to obtain the reference-frame line parameters a_m, b_m, c_m and the current-frame line parameters a_n, b_n, c_n.
[0046] Step 8. Calculate the translation.
[0047] To obtain the vertical offset between the sea-sky lines of the reference frame and the current frame output in step 5, subtract the average ordinates of the reference frame and the current frame obtained in step 6; the difference is the vertical translation Δy of the sea-sky line.
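Steps 6 and 8 can be sketched together: horizontally project each fitted line y = k·x + b by averaging its ordinates over a shared set of abscissas, then subtract the two averages to obtain Δy (function names and the k, b line representation are our assumptions):

```python
def mean_ordinate(k, b, xs):
    """Horizontal projection (step 6): average ordinate of the fitted
    sea-sky line y = k*x + b over the selected abscissas xs."""
    return sum(k * x + b for x in xs) / len(xs)

def vertical_offset(k_ref, b_ref, k_cur, b_cur, xs):
    """Step 8: the difference of the average ordinates is the vertical
    translation dy of the current frame relative to the reference frame."""
    return mean_ordinate(k_cur, b_cur, xs) - mean_ordinate(k_ref, b_ref, xs)
```

Using the same abscissas `xs` for both frames implements the "only part of the sea-sky line" restriction of step 6.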
[0048] Step 9. Find the deflection angle.
[0049] The angular difference between the sea-sky lines of the reference frame and the current frame output in step 5 is obtained from the correspondence between the deflection angle and the characteristic-line parameters. From step 7, the line equation of the reference-frame characteristic line is:
[0050] a_m x_m + b_m y_m + c_m = 0 (Formula 1)
[0051] The straight line equation of the characteristic straight line in the current frame is:
[0052] a_n x_n + b_n y_n + c_n = 0 (Formula 2)
[0053] From Formula 1 and Formula 2, the correspondence between the deflection angle Δθ and the characteristic-line parameters a_m, b_m, a_n, b_n is obtained by transformation:
[0054] sin Δθ = (a_m b_n − a_n b_m) / (a_n^2 + b_n^2)
cos Δθ = (a_m a_n + b_m b_n) / (a_n^2 + b_n^2)
Δθ = arctan(sin Δθ / cos Δθ) (Formula 3)
[0055] Thus, the deflection angle Δθ of the reference frame and the characteristic line of the current frame is obtained.
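Steps 7 and 9 can be sketched as follows. Two sampled points already determine a, b, c up to scale (a third point is redundant for an exact line), and Formula 3 then gives Δθ; `arctan2` replaces the plain arctan of Formula 3 to keep the correct quadrant (function names are ours):

```python
import numpy as np

def line_params(p1, p2):
    """Step 7: parameters (a, b, c) of the line a*x + b*y + c = 0
    through two sampled points on a fitted sea-sky line."""
    (x1, y1), (x2, y2) = p1, p2
    return y2 - y1, x1 - x2, x2 * y1 - x1 * y2

def deflection_angle(am, bm, an, bn):
    """Step 9 (Formula 3): deflection angle between the reference-frame
    and current-frame characteristic lines."""
    denom = an ** 2 + bn ** 2
    sin_d = (am * bn - an * bm) / denom
    cos_d = (am * an + bm * bn) / denom
    return np.arctan2(sin_d, cos_d)
```

The common denominator cancels in the final ratio, so any consistent scaling of (a, b, c) gives the same Δθ.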
[0056] Step 10. Translate and rotate.
[0057] The vertical offset Δy of the current-frame image relative to the reference-frame image is obtained in step 8, and the deflection angle Δθ of the current-frame image relative to the reference-frame image is obtained in step 9. Moving the current frame vertically by Δy and rotating it by Δθ yields the final image-stabilization result.
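The correction of step 10 can be sketched as a nearest-neighbour inverse mapping that shifts the current frame by Δy and rotates it by Δθ about the image centre (a minimal illustration; the interpolation scheme and rotation sign convention are our assumptions, and uncovered border pixels are simply set to 0):

```python
import numpy as np

def stabilize(frame, dy, dtheta):
    """Step 10: translate the current frame vertically by dy and rotate
    it by dtheta (radians) about the image centre."""
    h, w = frame.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.mgrid[0:h, 0:w]
    c, s = np.cos(dtheta), np.sin(dtheta)
    # inverse mapping: undo the shift, then the rotation, for each output pixel
    y0 = ys - dy - cy
    x0 = xs - cx
    src_x = np.round(c * x0 + s * y0 + cx).astype(int)
    src_y = np.round(-s * x0 + c * y0 + cy).astype(int)
    out = np.zeros_like(frame)
    valid = (src_x >= 0) & (src_x < w) & (src_y >= 0) & (src_y < h)
    out[valid] = frame[src_y[valid], src_x[valid]]
    return out
```

With dtheta = 0 the mapping is a pure vertical shift, which is easy to verify on a single bright row.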
[0058] The effect of the present invention can be further illustrated by the following simulation.
[0059] The invention was applied in a MATLAB simulation experiment on a 50-frame video image sequence, and the effects before and after image stabilization were compared via the peak signal-to-noise ratio (PSNR).
[0060] Figures 3(a), (b) and (c) are the images of the 1st, 20th and 40th frames of the video sequence before stabilization; Figures 3(d) and (e) are the stabilized images of the 20th and 40th frames. Figure 3(d) is the stable image obtained from Figure 3(b) by shifting it vertically 7 pixels downward and rotating it 0.34° to the right; Figure 3(e) is the stable image obtained from Figure 3(c) by shifting it vertically 1 pixel upward and rotating it 1.33° to the left.
[0061] In the comparison chart of Figure 4, the peak signal-to-noise ratio (PSNR) before and after image stabilization is calculated with the following formula:
[0062] PSNR = 10 log10[255^2 / MSE(S_1, S_0)]
[0063] where MSE is the mean square error between the reference frame and the current frame, defined as:
[0064] MSE(S_1, S_0) = (1 / (MN)) Σ_{i=n}^{n+N} Σ_{j=m}^{m+M} |S_1(i, j) − S_0(i, j)|^2
[0065] where S_0 is the reference-frame image, S_1 is the current-frame image, M and N are the horizontal and vertical sizes of S_1 and S_0, and m and n are the horizontal and vertical displacements of S_1 and S_0, both taken as 0 here.
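With m = n = 0, the MSE above reduces to a plain per-pixel mean, and the PSNR comparison can be computed as (a sketch; the function name is ours):

```python
import numpy as np

def psnr(ref, cur):
    """Peak signal-to-noise ratio between two 8-bit frames, per the formulas
    above with displacements m = n = 0."""
    mse = np.mean((ref.astype(np.float64) - cur.astype(np.float64)) ** 2)
    return 10.0 * np.log10(255.0 ** 2 / mse)
```

A higher PSNR for the stabilized current frame against the reference frame indicates better frame-to-frame alignment. Note that for identical frames MSE is 0 and the PSNR is undefined (infinite).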
[0066] As Figure 4 shows, the PSNR after image stabilization is on average 10.8 dB higher than before image stabilization, demonstrating that the present invention achieves a good image-stabilization effect on video sequences.


