Camera parameter calibration method based on GPS

A camera parameter calibration technology applied in the field of GPS-based camera calibration. It addresses problems such as calibration errors and the inability of calibration templates to fully meet accuracy requirements, and achieves the effect of improving calibration accuracy and precision while avoiding the errors caused by low-precision templates.

Inactive · Publication Date: 2017-12-12
NANJING UNIV OF SCI & TECH
Cites: 6 · Cited by: 18

AI-Extracted Technical Summary

Problems solved by technology

However, the general calibration template cannot fully meet the accuracy requirements, which introduces errors into the calibration.

Abstract

The invention proposes a camera parameter calibration method based on GPS. The method comprises the steps of: obtaining the longitude, latitude, and height coordinates of the centroid of a GPS receiving antenna in the GPS navigation coordinate system through the GPS receiving antenna, and collecting images of the antenna at different positions with a camera; performing coordinate conversion between the GPS navigation coordinate system and the world coordinate system to obtain the world coordinates of the antenna centroid, and obtaining the image coordinates of the centroid using the Harris corner detection algorithm; and solving the camera parameter matrix according to the camera imaging model. The method uses the precise coordinates provided by GPS in place of a calibration template during calibration, avoiding the calibration error caused by a low-precision template and improving calibration accuracy.

Application Domain

Technology Topic

GPS navigation · Environmental geology · +4

Image

  • Camera parameter calibration method based on GPS

Examples

  • Experimental program (1)

Example Embodiment

[0025] It is easy to understand that, according to the technical solution of the present invention, those skilled in the art can conceive of various implementations of the GPS-based camera parameter calibration method of the present invention without changing its essential spirit. Therefore, the following specific embodiments and drawings are only exemplary descriptions of the technical solution of the present invention and should not be regarded as its entirety or as a restriction or limitation on it.
[0026] 1. The basic idea of the invention
[0027] First, obtain the longitude, latitude, and altitude coordinate information of the GPS receiving antenna's centroid in the WGS-84 coordinate system through the GPS receiving antenna, and use the camera to collect images of the GPS receiving antenna at different positions;
[0028] Then, coordinate conversion is performed according to the relationship between the WGS-84 coordinate system and the world coordinate system to obtain the coordinates of the GPS receiving antenna centroid in the world coordinate system, and the Harris corner detection algorithm is used to obtain the image coordinates of the centroid;
[0029] Finally, the camera parameter matrix is solved according to the camera imaging model to realize the calibration of the camera's internal and external parameters.
[0030] 2. World coordinate system, camera coordinate system and image coordinate system
[0031] In the camera calibration process, in order to accurately establish a spatial position model from a two-dimensional image to a three-dimensional world, a unified coordinate system needs to be defined.
[0032] World coordinate system (X, Y, Z): The world coordinate system is a three-dimensional coordinate system defined according to actual needs. It is used to describe the positions of objects and the camera in three-dimensional space and follows the right-hand rule;
[0033] Camera coordinate system (X_c, Y_c, Z_c): The camera coordinate system takes the optical center of the camera as its origin. The Z_c axis coincides with the optical axis, is perpendicular to the imaging plane, and takes the photographing direction as positive; X_c and Y_c are parallel to the x and y axes of the image physical coordinate system; and the distance O_cO is the camera focal length f;
[0034] Image coordinate system: Owing to the working principle of the camera, the camera's native image coordinate system is measured in pixels: its origin is at the upper-left corner, so it does not express the position of each pixel in physical units. To address this, an image coordinate system based on physical units (millimeters, like the camera coordinate system) must also be established. Image pixel coordinate system (u, v): a rectangular coordinate system with the upper-left corner of the image as the origin and the pixel as the coordinate unit; u and v denote the column and row of a pixel in the digital image, respectively. Image physical coordinate system (x, y): a rectangular coordinate system whose origin is the intersection of the optical axis and the image plane and whose unit is the millimeter; its x and y axes are parallel to the u and v axes of the image pixel coordinate system, respectively.
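The two image coordinate systems are related through the physical pixel size. Writing dx and dy for the physical size of one pixel along x and y (a standard relation, implied but not written out in the text above):

$$u = \frac{x}{dx} + x_0, \qquad v = \frac{y}{dy} + y_0$$

With f the focal length, this is what makes α_x = f/dx and α_y = f/dy in formula (4) below the "normalized focal lengths".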
[0035] 3. GPS navigation coordinate system and coordinate system conversion
[0036] The GPS-based camera parameter calibration method of the present invention does not require a calibration template. Instead, coordinate information at different positions is obtained through the GPS receiving antenna, and the position of the antenna centroid in the world coordinate system is obtained through conversion between coordinate systems; these coordinates are then used to calibrate the camera parameters. The GPS receiver used in the present invention is a Beidouxingtong (BDStar Navigation) GPS, and the obtained coordinates are those of the centroid. Figure 2 shows the GPS receiving antenna at different positions, including a front view and a top view; the centroid is point O.
[0037] 1. GPS navigation coordinate system and navigation message data analysis
[0038] The navigation coordinate system adopted by the GPS used in the present invention is the WGS-84 coordinate system, as shown in Figure 3. This coordinate system is a conventional terrestrial system developed by the US Department of Defense. The WGS-84 coordinate system takes the earth's center of mass as the coordinate origin; the X axis points to the intersection of the CTP equator and the BIH1984.0 zero meridian; the Z axis points in the direction of the Conventional Terrestrial Pole (CTP) defined by BIH1984.0; and the Y axis forms a right-handed coordinate system with the X and Z axes.
[0039] A WGS-84 position is represented as (latitude B, longitude L, height H); this information is contained in the GPS navigation message, whose output format is ASCII. The GPS message data format is: $GPGGA,(1),(2),(3),(4),(5),(6),(7),(8),(9),M,(10),M,(11),(12)*hh. From this data, the target's (latitude B, longitude L, height H) information must be extracted, corresponding to field (2), latitude (format ddmm.mmmm); field (4), longitude (format dddmm.mmmm); and field (9), antenna elevation (above sea level, -9999.9~99999.9, unit: m). For example, one frame of data received in the experiment is: $GPGGA,085902.00,3201.6557,N,11851.4286,E,4,21,0.6,55.51,M,1.20,M,01,908Z*02, and the corresponding (latitude B, longitude L, height H) information is (32.02759445887, 118.85714386997, 55.5143).
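As an illustration of this field layout, the sketch below (mine, not part of the patent; checksum validation omitted) parses the quoted frame and converts the ddmm.mmmm / dddmm.mmmm fields to decimal degrees:

```python
# Hypothetical helper (not from the patent): extract (B, L, H) from a $GPGGA sentence.

def ddmm_to_degrees(value: str) -> float:
    """Convert NMEA ddmm.mmmm / dddmm.mmmm to decimal degrees."""
    dot = value.index(".")
    split = dot - 2                      # minutes are the two digits before the dot
    return float(value[:split]) + float(value[split:]) / 60.0

def parse_gpgga(sentence: str):
    """Return (latitude B, longitude L, height H); checksum checking omitted."""
    fields = sentence.split(",")
    assert fields[0] == "$GPGGA"
    lat = ddmm_to_degrees(fields[2])     # field (2): latitude, ddmm.mmmm
    if fields[3] == "S":
        lat = -lat
    lon = ddmm_to_degrees(fields[4])     # field (4): longitude, dddmm.mmmm
    if fields[5] == "W":
        lon = -lon
    height = float(fields[9])            # field (9): antenna elevation in metres
    return lat, lon, height

frame = "$GPGGA,085902.00,3201.6557,N,11851.4286,E,4,21,0.6,55.51,M,1.20,M,01,908Z*02"
print(parse_gpgga(frame))                # approximately (32.027595, 118.857143, 55.51)
```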
[0040] 2. Conversion between the GPS navigation coordinate system and the world coordinate system
[0041] When the GPS navigation coordinate system (B, L, H) information is known, the mutual conversion between various coordinate systems can be carried out.
[0042] First, convert the GPS navigation coordinates to the earth rectangular coordinate system. Let an arbitrary point P on the earth's surface be P_G(B, L, H) in the GPS navigation coordinate system and P_E(X_E, Y_E, Z_E) in the earth rectangular coordinate system. Then the relationship between the GPS navigation coordinate system and the earth rectangular coordinate system is:
[0043] $$\begin{cases} X_E = (N+H)\cos B\cos L \\ Y_E = (N+H)\cos B\sin L \\ Z_E = [N(1-e^2)+H]\sin B \end{cases} \quad (1)$$
[0044] In formula (1), N is the radius of curvature in the prime vertical of the ellipsoid, and e is the first eccentricity of the ellipsoid. Taking the earth's semi-major axis as a = 6378137 m and its semi-minor axis as b = 6356752 m, then:
[0045] $$N = \frac{a}{\sqrt{1 - e^2\sin^2 B}} \quad (2)$$

[0046] $$e^2 = \frac{a^2 - b^2}{a^2}$$
[0047] After the earth rectangular coordinates are obtained, they must be converted to the world coordinate system P(X, Y, Z). The world coordinate system takes the GPS receiving antenna master station O as its origin. Suppose the master station O has earth rectangular coordinates P_{E0}(X_{E0}, Y_{E0}, Z_{E0}) and the slave station A has earth rectangular coordinates P_E(X_E, Y_E, Z_E); then the coordinates of slave station A relative to master station O are:
[0048] $$\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} = \begin{bmatrix} X_E - X_{E0} \\ Y_E - Y_{E0} \\ Z_E - Z_{E0} \end{bmatrix} \quad (3)$$
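A short numeric sketch of formulas (1)-(3) follows (my own illustration, not the patent's code); it assumes the world frame is the plain ECEF difference written above, with no additional local-level rotation:

```python
# A numeric sketch of formulas (1)-(3); the sample slave position is illustrative.

import math

A_AXIS = 6378137.0                                   # semi-major axis a (m)
B_AXIS = 6356752.0                                   # semi-minor axis b (m)
E2 = (A_AXIS**2 - B_AXIS**2) / A_AXIS**2             # first eccentricity squared

def geodetic_to_ecef(B_deg, L_deg, H):
    """Formula (1): (B, L, H) -> earth rectangular coordinates (X_E, Y_E, Z_E)."""
    B, L = math.radians(B_deg), math.radians(L_deg)
    N = A_AXIS / math.sqrt(1.0 - E2 * math.sin(B) ** 2)   # formula (2)
    return ((N + H) * math.cos(B) * math.cos(L),
            (N + H) * math.cos(B) * math.sin(L),
            (N * (1.0 - E2) + H) * math.sin(B))

def ecef_to_world(p_ecef, o_ecef):
    """Formula (3): coordinates of slave station A relative to master station O."""
    return tuple(p - o for p, o in zip(p_ecef, o_ecef))

o = geodetic_to_ecef(32.02759445887, 118.85714386997, 55.5143)   # master station O
p = geodetic_to_ecef(32.02760, 118.85715, 55.60)                 # hypothetical position
print(ecef_to_world(p, o))
```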
[0049] 4. The camera imaging model
[0050] Figure 4 shows the ideal camera imaging model, the pinhole camera model, which describes the relationship between the camera coordinate system and the image coordinate system. In the pinhole camera model, a point P with spatial coordinates (X, Y, Z) is mapped to a point p on the image plane Π, where p is the intersection of the image plane with the line connecting P and the projection center (the camera optical center) C. The perpendicular from the optical center C to the image plane is the principal axis of the camera; the intersection of the principal axis with the image plane is the principal point c(x_0, y_0); and the distance between the optical center C and the principal point c(x_0, y_0) is the camera focal length f.
[0051] From Figure 4, the relationship between the camera coordinate system (X_c, Y_c, Z_c) and the image coordinate system (u, v) is:
[0052] $$s\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = A\,[I \mid 0]\begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix}, \qquad A = \begin{bmatrix} \alpha_x & 0 & x_0 \\ 0 & \alpha_y & y_0 \\ 0 & 0 & 1 \end{bmatrix} \quad (4)$$
[0053] In formula (4), α_x and α_y are the scale factors on the u and v axes, also called the normalized focal lengths on the u and v axes; s is the scale factor; A is the camera internal parameter matrix; and I is the 3×3 identity matrix.
[0054] From Figure 5, the Euclidean transformation between the camera coordinate system (X_c, Y_c, Z_c) and the world coordinate system (X, Y, Z) is:
[0055] $$\begin{bmatrix} X_c \\ Y_c \\ Z_c \\ 1 \end{bmatrix} = \begin{bmatrix} R & t \\ 0^T & 1 \end{bmatrix}\begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} \quad (5)$$
[0056] In formula (5), R is a 3×3 rotation matrix and t is a 3×1 translation vector.
[0057] From equations (4) and (5), the relationship between the image coordinate system (u, v) and the world coordinate system (X, Y, Z) can be obtained:
[0058] $$s\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = A\,[R \mid t]\begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} = M\begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} \quad (6)$$
[0059] In formula (6), M = A[R|t] is the camera parameter matrix, where A is the camera internal parameter matrix, [R|t] is the camera external parameter matrix, R is the rotation matrix, and t is the translation vector. Calibrating the camera parameters is the process of solving for each parameter of the M matrix.
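To make formula (6) concrete, the sketch below (an illustration of mine; the matrices are made-up numbers, not the patent's calibration result) projects a world point to pixel coordinates through M = A[R|t]:

```python
# Illustrative projection through M = A[R|t]; all numbers are made up for the sketch.

import numpy as np

A = np.array([[1650.0,    0.0, 320.0],    # alpha_x, 0,       x0
              [   0.0, 1650.0, 240.0],    # 0,       alpha_y, y0
              [   0.0,    0.0,   1.0]])
R = np.eye(3)                             # rotation (identity for simplicity)
t = np.array([[0.0], [0.0], [5.0]])       # translation: 5 m along the optical axis

M = A @ np.hstack([R, t])                 # 3x4 camera parameter matrix, formula (6)

P = np.array([0.1, -0.05, 0.0, 1.0])      # homogeneous world point (X, Y, Z, 1)
p = M @ P
u, v = p[0] / p[2], p[1] / p[2]           # divide out the scale factor s
print(u, v)                               # pixel coordinates of the projection
```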
[0060] 5. GPS-based camera parameter calibration
[0061] The camera imaging model of equation (6) can be written in the following form:
[0062] $$s_i\begin{bmatrix} u_i \\ v_i \\ 1 \end{bmatrix} = \begin{bmatrix} m_{11} & m_{12} & m_{13} & m_{14} \\ m_{21} & m_{22} & m_{23} & m_{24} \\ m_{31} & m_{32} & m_{33} & m_{34} \end{bmatrix}\begin{bmatrix} X_i \\ Y_i \\ Z_i \\ 1 \end{bmatrix} \quad (8)$$
[0063] In formula (8), s_i is the scale factor; (X_i, Y_i, Z_i, 1)^T is the homogeneous world-coordinate representation of the GPS receiving antenna centroid at the i-th position; (u_i, v_i, 1)^T is the corresponding homogeneous image coordinate of the centroid in the captured image; and m_ab is the element in row a and column b of the M matrix.
[0064] Expanding formula (8), we get:
[0065] $$\begin{cases} s_i u_i = m_{11}X_i + m_{12}Y_i + m_{13}Z_i + m_{14} \\ s_i v_i = m_{21}X_i + m_{22}Y_i + m_{23}Z_i + m_{24} \\ s_i = m_{31}X_i + m_{32}Y_i + m_{33}Z_i + m_{34} \end{cases} \quad (9)$$
[0066] Eliminating s_i in formula (9) gives:
[0067] $$\begin{cases} m_{11}X_i + m_{12}Y_i + m_{13}Z_i + m_{14} - u_i(m_{31}X_i + m_{32}Y_i + m_{33}Z_i + m_{34}) = 0 \\ m_{21}X_i + m_{22}Y_i + m_{23}Z_i + m_{24} - v_i(m_{31}X_i + m_{32}Y_i + m_{33}Z_i + m_{34}) = 0 \end{cases} \quad (10)$$
[0068] For n known world coordinates (X_i, Y_i, Z_i) and the corresponding image coordinates (u_i, v_i), the direct linear transformation (DLT) method can be used to solve for each element of the M matrix, namely:
[0069] $$\begin{bmatrix} X_1 & Y_1 & Z_1 & 1 & 0 & 0 & 0 & 0 & -u_1X_1 & -u_1Y_1 & -u_1Z_1 & -u_1 \\ 0 & 0 & 0 & 0 & X_1 & Y_1 & Z_1 & 1 & -v_1X_1 & -v_1Y_1 & -v_1Z_1 & -v_1 \\ \vdots & & & & & & & & & & & \vdots \\ X_n & Y_n & Z_n & 1 & 0 & 0 & 0 & 0 & -u_nX_n & -u_nY_n & -u_nZ_n & -u_n \\ 0 & 0 & 0 & 0 & X_n & Y_n & Z_n & 1 & -v_nX_n & -v_nY_n & -v_nZ_n & -v_n \end{bmatrix}\begin{bmatrix} m_{11} \\ m_{12} \\ \vdots \\ m_{34} \end{bmatrix} = 0 \quad (11)$$
[0070] Setting m_34 = 1 in formula (11) yields 2n linear equations in the remaining eleven elements m_11 ~ m_33. Let:
[0071] $$K = \begin{bmatrix} X_1 & Y_1 & Z_1 & 1 & 0 & 0 & 0 & 0 & -u_1X_1 & -u_1Y_1 & -u_1Z_1 \\ 0 & 0 & 0 & 0 & X_1 & Y_1 & Z_1 & 1 & -v_1X_1 & -v_1Y_1 & -v_1Z_1 \\ \vdots & & & & & & & & & & \vdots \\ X_n & Y_n & Z_n & 1 & 0 & 0 & 0 & 0 & -u_nX_n & -u_nY_n & -u_nZ_n \\ 0 & 0 & 0 & 0 & X_n & Y_n & Z_n & 1 & -v_nX_n & -v_nY_n & -v_nZ_n \end{bmatrix}$$

[0072] $$m = (m_{11}, m_{12}, m_{13}, m_{14}, m_{21}, m_{22}, m_{23}, m_{24}, m_{31}, m_{32}, m_{33})^T$$

[0073] $$U = (u_1, v_1, u_2, v_2, \ldots, u_n, v_n)^T \quad (12)$$
[0074] Equation (11) can be rewritten as:
[0075] $$Km = U \quad (13)$$
[0076] The least squares method can be used to find m in formula (13), namely:

[0077] $$m = (K^TK)^{-1}K^TU \quad (14)$$
[0078] Then each element in the M matrix in equation (6) can be calculated.
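A compact sketch of formulas (11)-(14) follows (my own illustration, not the patent's code), assuming the n world points and pixel coordinates are already available as arrays:

```python
# Assembles K and U as in formula (12) and solves formula (13) by least squares.

import numpy as np

def solve_m(world_pts, image_pts):
    """world_pts: (n, 3) array of (X, Y, Z); image_pts: (n, 2) array of (u, v); n >= 6."""
    rows, rhs = [], []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z])
        rhs += [u, v]
    K = np.asarray(rows, dtype=float)               # (2n, 11) coefficient matrix
    U = np.asarray(rhs, dtype=float)                # (2n,) right-hand side
    m, *_ = np.linalg.lstsq(K, U, rcond=None)       # least-squares solution, formula (14)
    return np.append(m, 1.0).reshape(3, 4)          # reinstate m34 = 1
```

np.linalg.lstsq returns the same least-squares solution as the closed form m = (K^T K)^{-1} K^T U, but without explicitly inverting K^T K, which is numerically better conditioned.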
[0079] 6. Solving the camera's internal and external parameters
[0080] After m is obtained from equation (13), all the internal and external parameters of the camera can be obtained in turn from the relationship between the M matrix and those parameters.
[0081] From equation (8), the relationship between the M matrix and the internal and external parameters of the camera is:
[0082] $$m_{34}\begin{bmatrix} m_1^T & m_{14} \\ m_2^T & m_{24} \\ m_3^T & 1 \end{bmatrix} = \begin{bmatrix} \alpha_x r_1^T + x_0 r_3^T & \alpha_x t_x + x_0 t_z \\ \alpha_y r_2^T + y_0 r_3^T & \alpha_y t_y + y_0 t_z \\ r_3^T & t_z \end{bmatrix} \quad (15)$$
[0083] In formula (15), m_a (a = 1, 2, 3) is the row vector formed by the first three elements of the a-th row of the M matrix; m_a4 (a = 1, 2, 3) is the element in the a-th row and fourth column of the M matrix; r_a (a = 1, 2, 3) is the a-th row of the rotation matrix R in the camera external parameters; and t_x, t_y, t_z are the three components of the translation vector t in the camera external parameters.
[0084] From equation (15), we can get:
[0085] $$\begin{cases} m_{34}\,m_3 = r_3 \\ m_{34}\,m_1 = \alpha_x r_1 + x_0 r_3 \\ m_{34}\,m_2 = \alpha_y r_2 + y_0 r_3 \\ m_{34}\,m_{14} = \alpha_x t_x + x_0 t_z \\ m_{34}\,m_{24} = \alpha_y t_y + y_0 t_z \\ m_{34} = t_z \end{cases} \quad (16)$$
[0086] From formula (16), m_34 m_3 = r_3. Since r_3 is the third row of the orthogonal rotation matrix R, |r_3| = 1, and therefore m_34 = 1/|m_3|. The other parameters of the M matrix can then be calculated according to formula (17).
[0087] $$\begin{cases} r_3 = m_{34}\,m_3 \\ x_0 = m_{34}^2\,(m_1 \cdot m_3) \\ y_0 = m_{34}^2\,(m_2 \cdot m_3) \end{cases}$$

[0088] $$\begin{cases} \alpha_x = m_{34}^2\,|m_1 \times m_3| \\ \alpha_y = m_{34}^2\,|m_2 \times m_3| \end{cases}$$

[0089] $$\begin{cases} r_1 = m_{34}(m_1 - x_0 m_3)/\alpha_x \\ r_2 = m_{34}(m_2 - y_0 m_3)/\alpha_y \\ t_x = m_{34}(m_{14} - x_0)/\alpha_x \\ t_y = m_{34}(m_{24} - y_0)/\alpha_y \\ t_z = m_{34} \end{cases} \quad (17)$$
[0090] where x_0, y_0, α_x, α_y are the parameters of the camera internal parameter matrix A; r_1, r_2, r_3 form the rotation matrix R in the camera external parameters; and t_x, t_y, t_z are the three components of the translation vector t in the camera external parameters.
[0091] According to formula (17), each parameter of the camera's internal parameter matrix A and external parameter matrix [R|t] can be calculated.
[0092] In summary, from six or more known points in space and their corresponding image coordinates, the M matrix can be obtained, and the internal and external parameters of the camera then follow from formula (17).
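The decomposition in formulas (15)-(17) can be written out as below (a sketch of mine, assuming M has already been normalised so that its last element is 1):

```python
# Recovers the internal matrix A and external parameters R, t from an M whose element
# m34 has been normalised to 1, following formulas (15)-(17).

import numpy as np

def decompose_m(M):
    m1, m2, m3 = M[0, :3], M[1, :3], M[2, :3]
    m14, m24 = M[0, 3], M[1, 3]

    m34 = 1.0 / np.linalg.norm(m3)                  # |r3| = 1 fixes the scale
    x0 = m34**2 * np.dot(m1, m3)
    y0 = m34**2 * np.dot(m2, m3)
    alpha_x = m34**2 * np.linalg.norm(np.cross(m1, m3))
    alpha_y = m34**2 * np.linalg.norm(np.cross(m2, m3))

    r1 = m34 * (m1 - x0 * m3) / alpha_x
    r2 = m34 * (m2 - y0 * m3) / alpha_y
    r3 = m34 * m3
    t = np.array([m34 * (m14 - x0) / alpha_x,
                  m34 * (m24 - y0) / alpha_y,
                  m34])                             # t_z = m34

    A = np.array([[alpha_x, 0.0, x0], [0.0, alpha_y, y0], [0.0, 0.0, 1.0]])
    return A, np.vstack([r1, r2, r3]), t
```

As a sanity check, applying decompose_m(M / M[2, 3]) to the illustrative M from the projection sketch above recovers the A, R, and t it was built from, up to numerical error.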
[0093] 7. Implementation process of the method of the present invention
[0094] Step 1: Set up the GPS receiving antenna master station O and slave station A; connect the radio, GPS, and computer; put the GPS into RTK (Real-Time Kinematic, carrier-phase differential) working mode; and record the longitude, latitude, and height information P_{G0}(B_0, L_0, H_0) of master station O;
[0095] Step 2: Move the GPS receiving antenna slave station A to different locations, trying to make the movement range cover the camera's entire field of view. Use the camera to collect images I_i (i = 1, 2, ..., n) of slave station A at the different locations, and record the longitude, latitude, and height information P_{Gi}(B_i, L_i, H_i) of the antenna at each location;
[0096] Step 3: According to formulas (1)~(3), convert the longitude, latitude, and height information P_{Gi}(B_i, L_i, H_i) into world coordinates P_i(X_i, Y_i, Z_i) (i = 1, 2, ..., n);
[0097] Step 4: Perform image preprocessing, such as binarization and smoothing filtering, on the collected images I_i (i = 1, 2, ..., n);
[0098] Step 5: Use the Harris corner detection algorithm to extract the centroid coordinates (u_i, v_i) (i = 1, 2, ..., n) of the GPS receiving antenna in each image I_i (a sketch of steps 4 and 5 follows this list);
[0099] Step 6: Calculate the M matrix according to formulas (8)~(14), and calculate the internal and external parameters of the camera according to formulas (15)~(17).
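The sketch below illustrates steps 4 and 5 under the assumption that OpenCV is used; the patent does not name a library, and taking the single strongest Harris response as the centroid is a simplification of the extraction step:

```python
# Steps 4-5 sketched with OpenCV (an assumption; the patent names no library).

import cv2
import numpy as np

def antenna_centroid(image_path):
    """Return the (u, v) pixel coordinates of the strongest Harris corner."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    img = cv2.GaussianBlur(img, (5, 5), 0)                          # smoothing filter
    _, binary = cv2.threshold(img, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)  # binarization
    response = cv2.cornerHarris(np.float32(binary), 2, 3, 0.04)     # Harris response map
    v, u = np.unravel_index(np.argmax(response), response.shape)    # row, column
    return float(u), float(v)
```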
[0100] The beneficial effects of the present invention can be further illustrated by the following experiment:
[0101] This experiment uses a Beidouxingtong GPS; the camera is a Basler acA640-90gc with a CCD size of 4.88 mm × 3.66 mm and a resolution of 658 × 492; and the lens focal length is f = 12 mm. The experimental setup is shown in Figure 6.
[0102] In the experiment, following the specific steps of the method of the present invention, images I_i (i = 1, 2, ..., 20) of the GPS receiving antenna at different positions in the field of view were captured, and the corresponding centroid coordinates P_{Gi}(B_i, L_i, H_i) of the antenna were recorded. According to formulas (1)~(3), the longitude, latitude, and height information was converted into world coordinates P_i(X_i, Y_i, Z_i) (i = 1, 2, ..., 20). The Harris corner detection method was used to extract the centroid pixel coordinates (u_i, v_i) (i = 1, 2, ..., 20). The M matrix was then obtained according to formulas (8)~(14), and the camera's internal and external parameters were obtained from formulas (15)~(17). The calculated M matrix is:
[0103]
[0104] Decomposing the M matrix gives the camera internal parameters:
[0105] $$f_x = 1659.57, \quad f_y = 1650.87$$

[0106] $$c_x = 337.23, \quad c_y = 239.47$$
[0107] The camera external parameter matrix [R|t] is:
[0108]
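As a plausibility check (my arithmetic, not part of the patent): the CCD pixel pitch is 4.88 mm / 658 ≈ 0.00742 mm, so f_x = 1659.57 pixels corresponds to a focal length of roughly 1659.57 × 0.00742 ≈ 12.3 mm, consistent with the 12 mm lens used.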
[0109] The above experiment demonstrates that the camera parameter calibration method proposed by the present invention can obtain high-precision spatial coordinate information through GPS and avoids the calibration error caused by the low precision of a calibration template. Compared with traditional camera calibration methods, this method improves the accuracy and precision of camera parameter calibration.


Similar technology patents

Method and system for predicting shale formation pressure

Active · CN109306866A · Reliable theoretical basis · Avoid low precision · Borehole/well accessories · Shale gas · Reservoir pressure
Owner: CHINA PETROLEUM & CHEM CORP +1

Accurate measurement positioning method based on calibration algorithm

Pending · CN112767494A · Avoid cumbersome process · Avoid low precision · Image enhancement · Image analysis · Calibration algorithm · Measurement precision
Owner: YUNNAN ASTRONOMICAL OBSERVATORY CHINESE ACAD OF SCI +1

Processing method and system for casting reference

Active · CN111531405A · Avoid low precision · Benchmark consistent · Measurement/indication equipments · Manufacturing engineering · Information acquisition
Owner: SHANGHAI JIAO TONG UNIV +1

Classification and recommendation of technical efficacy words

  • Avoid low precision
  • Improve accuracy and precision

Patrolling robot mechanical arm tail end space positioning method

Inactive · CN108908344A · Eliminate navigation and motion errors · Avoid low precision · Programme-controlled manipulator · Distance sensors · Engineering
Owner: YUNNAN POWER GRID CO LTD KUNMING POWER SUPPLY BUREAU