Unmanned vehicle, and unmanned vehicle positioning method, device and system

An unmanned vehicle and positioning technology, applied in the fields of unmanned vehicles and vehicle engineering. It addresses the problems that existing methods cannot provide high-precision, high-stability positioning results and produce large positioning errors, and achieves the effect of avoiding large positioning errors and positioning accurately.

Active Publication Date: 2016-10-12
BAIDU ONLINE NETWORK TECH (BEIJING) CO LTD
Cites: 4 · Cited by: 58

AI-Extracted Technical Summary

Problems solved by technology

[0005] However, the existing real-time differential positioning method produces large positioning errors when the GPS satellite signal is...

Method used

Because the map areas divided in step 310 may cover a large area, the number of laser points in each map area may be of a considerable order of magnitude when the laser point cloud reflection value map is generated, making the computation of positioning processing heavy. In addition, when a map area covers a large area, the accuracy of the positioning result obtained based on the map is low. Therefore, in step 320, each map area can be further subdivided, thereby reducing the computation of positioning processing and improving the positioning accuracy of the positioning result.
By determining the first matching probability of the laser point cloud projection data with each area of a predetermined range containing the a priori positioning position in the laser point ...

Abstract

The invention discloses an unmanned vehicle and an unmanned vehicle positioning method, device, and system. The method comprises: obtaining first laser point cloud reflection value data matching the current position of the unmanned vehicle, the first laser point cloud reflection value data comprising the first coordinates of each laser point and the laser reflection intensity value corresponding to each laser point; converting the first laser point cloud reflection value data into laser point cloud projection data in the ground plane; taking the predetermined a priori positioning position in a laser point cloud reflection value map as the initial position, determining a first matching probability of the laser point cloud projection data within a predetermined range of the laser point cloud reflection value map; and determining the position of the unmanned vehicle in the laser point cloud reflection value map based on the first matching probability. In this way, the current position of the unmanned vehicle can be determined accurately.

Image

  • Unmanned vehicle, and unmanned vehicle positioning method, device and system

Examples

  • Experimental program (1)

Example Embodiment

[0021] The application will be further described in detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the relevant invention, not to limit it. It should also be noted that, for ease of description, only the parts related to the relevant invention are shown in the drawings.
[0022] It should be noted that the embodiments of the application and the features in the embodiments can be combined with each other as long as there is no conflict. The present application will be described in detail below with reference to the drawings and in conjunction with the embodiments.
[0023] Figure 1 shows an exemplary system architecture 100 to which embodiments of the unmanned vehicle positioning method based on laser point cloud reflection value matching, or of the unmanned vehicle positioning device based on laser point cloud reflection value matching, of the present application can be applied.
[0024] As shown in Figure 1, the system architecture 100 may include an unmanned vehicle 101, a network 102, and a server 103. The network 102 serves as a medium providing a communication link between the unmanned vehicle 101 and the server 103, and may include various connection types, such as wired or wireless communication links or fiber optic cables.
[0025] The unmanned vehicle 101 can interact with the server 103 through the network 102 to receive or send messages. The unmanned vehicle 101 may be equipped with a laser point cloud collection device, a communication device, a processor, and so on.
[0026] The server 103 may be a server that provides various services, for example, a server that processes the laser point cloud reflection value data collected by the unmanned vehicle 101. The server 103 may analyze and process the received laser point cloud reflection value data, and feed back the processing result (for example, the positioning information of the unmanned vehicle) to the unmanned vehicle 101.
[0027] It should be noted that the unmanned vehicle positioning method based on laser point cloud reflection value matching provided by the embodiments of the present application may be executed by the unmanned vehicle 101, by the server 103, or with some steps executed by the unmanned vehicle 101 and the other steps executed by the server 103. Correspondingly, the unmanned vehicle positioning device based on laser point cloud reflection value matching may be set in the server 103, in the unmanned vehicle 101, or with part of its modules set in the server 103 and the other part set in the unmanned vehicle 101.
[0028] It should be understood that the numbers of unmanned vehicles 101, networks 102, and servers 103 in Figure 1 are merely illustrative; there may be any number of unmanned vehicles, networks, and servers according to implementation needs.
[0029] Continuing to refer to Figure 2, there is shown a process 200 of an embodiment of an unmanned vehicle positioning method based on laser point cloud reflection value matching according to the present application. The method includes the following steps:
[0030] Step 210: Obtain the first laser point cloud reflection value data matching the current position of the unmanned vehicle. The first laser point cloud reflection value data includes the first coordinates of each laser point and the laser reflection intensity value corresponding to each laser point in the first laser point cloud reflection value data.
[0031] The reflection value of a laser point cloud is the reflection intensity after the laser irradiates an object; its value range may be, for example, 0-255.
[0032] In some optional implementations, if the unmanned vehicle positioning method based on laser point cloud reflection value matching of this embodiment is applied to the unmanned vehicle in Figure 1, the laser point cloud reflection value collection device installed on the unmanned vehicle can collect the first laser point cloud reflection value data at the current position of the unmanned vehicle.
[0033] Alternatively, in other optional implementations, if the unmanned vehicle positioning method based on laser point cloud reflection value matching of this embodiment is applied to the server in Figure 1, the laser point cloud reflection value collection device of the unmanned vehicle can collect the first laser point cloud reflection value data at the current position of the unmanned vehicle and upload it to the server through a wired or wireless connection, thereby accomplishing the acquisition of the first laser point cloud reflection value data. It should be pointed out that the above-mentioned wireless connection methods may include, but are not limited to, 3G/4G, WiFi, Bluetooth, WiMAX, Zigbee, UWB (ultra-wideband), and other currently known or future wireless connection methods.
[0034] Step 220: Convert the first laser point cloud reflection value data into laser point cloud projection data in the ground plane.
[0035] By projecting the first laser point cloud reflection value data onto the ground plane, the laser point cloud reflection values at the coordinate positions in three-dimensional space obtained in step 210 can be converted into laser point cloud reflection values at coordinate positions in the ground plane.
[0036] Step 230: Using the predetermined a priori positioning position in the laser point cloud reflection value map as the initial position, determine the first matching probability of the laser point cloud projection data within a predetermined range of the laser point cloud reflection value map.
[0037] Here, the a priori positioning position may be the current position of the unmanned vehicle determined by another positioning method, or the current position of the unmanned vehicle predicted by some prediction algorithm.
[0038] By determining the first matching probability between the laser point cloud projection data and each area of the predetermined range containing the a priori positioning position in the laser point cloud reflection value map, the a priori positioning position can be "corrected", thereby reducing the error of the final positioning result of the unmanned vehicle.
[0039] Step 240: Determine the position of the unmanned vehicle in the laser point cloud reflection value map based on the first matching probability.
[0040] For example, if the first matching probability of the laser point cloud projection data with a certain area in the predetermined range of the laser point cloud reflection value map is higher than its first matching probability with the other areas in the predetermined range, then, in some optional implementations, the area with the highest first matching probability may be used as the position of the current unmanned vehicle in the laser point cloud reflection value map.
[0041] Alternatively, in some other optional implementations, the first matching probabilities between the laser point cloud projection data and the areas in the predetermined range of the laser point cloud reflection value map may be further processed, and the current position of the unmanned vehicle in the laser point cloud reflection value map may be determined from the processing result.
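The first of these alternatives amounts to an argmax over the candidate areas. A minimal sketch in Python; the probability array `probs` and the offset convention are hypothetical, not taken from the patent:

```python
import numpy as np

# Hypothetical (2k+1) x (2k+1) grid of first matching probabilities, with k = 2.
probs = np.random.rand(5, 5)

# First alternative: take the area with the highest first matching probability.
i, j = np.unravel_index(np.argmax(probs), probs.shape)
print(f"highest-probability offset from the a priori grid: ({i - 2}, {j - 2})")
```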
[0042] The unmanned vehicle positioning method based on laser point cloud reflection value matching of this embodiment converts the first laser point cloud reflection value data matching the current position of the unmanned vehicle into laser point cloud projection data, matches the laser point cloud projection data against each area of a predetermined range of the laser point cloud reflection value map, and determines the position of the unmanned vehicle in the laser point cloud reflection value map based on the matching probabilities, so that the a priori positioning position is corrected and precise positioning of the unmanned vehicle is achieved.
[0043] In some optional implementations, the laser point cloud reflection value map used in the unmanned vehicle positioning method based on laser point cloud reflection value matching of this embodiment can be generated by, for example, the process 300 shown in Figure 3.
[0044] Specifically, in step 310, the surface of the earth is divided into M×N map areas in the ground plane of the world coordinate system, where the map areas may, for example, have the same size and shape.
[0045] In some optional implementations, the world coordinate system may adopt, for example, the UTM (Universal Transverse Mercator) coordinate system.
[0046] In step 320, the map area is further divided into m×n map grids, where the map grids have the same size and shape.
[0047] Since the map areas divided in step 310 may cover a large area, the number of laser points in a map area may be of a considerable order of magnitude when the laser point cloud reflection value map is generated, making the computation of positioning processing heavy. In addition, when a map area covers a large area, the accuracy of the positioning result obtained based on the map is low. Therefore, in step 320, the map area can be further subdivided, thereby reducing the computation of positioning processing and improving the accuracy of the positioning result.
[0048] Step 330: Collect second laser point cloud reflection value data corresponding to the location of each map grid, where the second laser point cloud reflection value data includes the second coordinates of each laser point in the world coordinate system and the laser reflection intensity value corresponding to each laser point in the second laser point cloud reflection value data.
[0049] For example, suppose the abscissa of a certain map grid satisfies x ∈ [x_a, x_b] and the ordinate satisfies y ∈ (y_c, y_d]. In this step, the laser reflection intensity value of each laser point whose world coordinates fall within this range can be collected, and the laser reflection intensity values of the laser points within the coordinate range of every other map grid on the earth's surface can be collected in a similar manner.
[0050] Step 340: Store the corresponding map data in each map grid, where the map data includes the mean of the laser reflection intensity values of the laser points within the location corresponding to the map grid, the variance of those laser reflection intensity values, and the number of laser points within the location corresponding to the map grid.
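As a rough illustration of steps 330 and 340, the sketch below bins world-frame laser points into the map grids of one map area and accumulates the per-grid mean, variance, and count of the reflection intensities. All names and the cell-size parameter are illustrative assumptions, not values from the patent:

```python
import numpy as np

def build_grid_stats(xy, intensity, x_min, y_min, cell, m, n):
    """xy: (N, 2) world coordinates; intensity: (N,) reflection values in [0, 255]."""
    ix = ((xy[:, 0] - x_min) // cell).astype(int)   # grid column of each point
    iy = ((xy[:, 1] - y_min) // cell).astype(int)   # grid row of each point
    mean = np.zeros((m, n))
    m2 = np.zeros((m, n))                           # running sum of squared deviations
    count = np.zeros((m, n))
    inside = (ix >= 0) & (ix < m) & (iy >= 0) & (iy < n)
    for i, j, r in zip(ix[inside], iy[inside], intensity[inside]):
        # Welford's incremental update of the per-grid mean and variance.
        count[i, j] += 1
        d = r - mean[i, j]
        mean[i, j] += d / count[i, j]
        m2[i, j] += d * (r - mean[i, j])
    var = np.divide(m2, count, out=np.zeros_like(m2), where=count > 0)
    return mean, var, count                         # the map data stored in step 340
```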
[0051] Returning to Figure 2, in some optional implementations, the first coordinates of the laser points in step 210 may be the coordinates of each laser point of the first laser point cloud reflection value data in the vehicle coordinate system of the unmanned vehicle.
[0052] In these optional implementations, converting the first laser point cloud reflection value data into the laser point cloud projection data in the ground plane in step 220 may further include:
[0053] Step 221: Convert the first laser point cloud reflection value data into third laser point cloud reflection value data.
[0054] Here, the third laser point cloud reflection value data may include, for example, the third coordinates of each laser point and the laser reflection intensity value corresponding to each laser point in the first laser point cloud reflection value data, where the third coordinates are the coordinates of each laser point of the first laser point cloud reflection value data in the world coordinate system. The third coordinate X′ can be:
[0055] $$X' = (x', y', z')^{T} = RX + T \tag{1}$$
[0056] where R is the rotation matrix from the vehicle coordinate system of the unmanned vehicle to the world coordinate system, $X = (x, y, z)^{T}$ is the first coordinate of each laser point in the first laser point cloud reflection value data, and T is the translation matrix from the vehicle coordinate system of the unmanned vehicle to the world coordinate system.
[0057] Step 222: Project the third laser point cloud reflection value data to the ground plane to generate laser point cloud projection data.
[0058] The laser point cloud projection data may include the projection coordinates of each laser point in the first laser point cloud reflection value data, the mean of the laser reflection intensity values of the laser points in each projection grid, the variance of the laser reflection intensity values of the laser points in each projection grid, and the number of laser points in each projection grid.
[0059] Here, the projection coordinate X″ of each laser point in the first laser point cloud reflection value data satisfies:
[0060] $$X'' = (x'', y'')^{T} = SX' \tag{2}$$
[0061] where S is the projection matrix and satisfies:
[0062] $$S = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix}.$$
[0063] Through formula (1) and formula (2), the laser point cloud data in three-dimensional space collected in the unmanned vehicle coordinate system (i.e., the first laser point cloud reflection value data) can be converted into laser point cloud data in the ground plane of the world coordinate system (i.e., the laser point cloud projection data).
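A minimal sketch of formulas (1) and (2) in Python; the pose (R, T) is a placeholder here and would in practice come from the vehicle's pose estimate:

```python
import numpy as np

def to_projection(points_vehicle, R, T):
    """points_vehicle: (N, 3) first coordinates X; returns (N, 2) projections X''."""
    X_world = points_vehicle @ R.T + T     # formula (1): X' = RX + T
    S = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])        # the projection matrix S
    return X_world @ S.T                   # formula (2): X'' = SX'

R = np.eye(3)                              # assumed identity rotation for the demo
T = np.array([100.0, 200.0, 0.0])          # assumed translation
print(to_projection(np.array([[1.0, 2.0, 3.0]]), R, T))   # -> [[101. 202.]]
```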
[0064] Here, each projection grid may have the same size and shape as the map grid. For example, the projection grid and the map grid may be congruent rectangles.
[0065] In some optional implementations, step 230 in Figure 2 can be further implemented by the following process.
[0066] Step 231: Align the center projection grid O(x, y) of the laser point cloud projection data with the map grid corresponding to the a priori positioning position O′(x_0, y_0) in the laser point cloud reflection value map so that the two overlap, where the center projection grid O(x, y) is the projection grid representing the unmanned vehicle body in the laser point cloud projection data.
[0067] In some application scenarios, the laser point cloud reflection value collection device installed on the unmanned vehicle collects laser point cloud reflection value data around the unmanned vehicle within a predetermined radius; that is, the collected data lie in a sphere centered on the unmanned vehicle with that predetermined radius. In these application scenarios, part of the data in the sphere can be intercepted for subsequent matching and positioning. For example, a cuboid or cube inscribed in the sphere can be constructed, and the laser point cloud reflection values of the laser points falling into the cuboid or cube are used as the laser point cloud reflection value data for positioning (that is, the first laser point cloud reflection value data). Therefore, in these application scenarios, the center projection grid O(x, y) representing the unmanned vehicle in the finally generated laser point cloud projection data falls exactly at the geometric center of the entire projection range.
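A sketch of the interception just described, assuming an axis-aligned cube inscribed in the collection sphere; `half_side` is an illustrative parameter, not a value from the patent:

```python
import numpy as np

def crop_to_cube(points, center, half_side):
    """points: (N, 3) laser points; center: (3,) vehicle position in the same frame."""
    mask = np.all(np.abs(points - center) <= half_side, axis=1)
    return points[mask]   # the subset used as the first laser point cloud data
```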
[0068] Step 232: Determine the first matching probability between the projection range of the laser point cloud projection data and the corresponding map range.
[0069] Assuming that the projection range formed by the laser point cloud projection data includes 5×5 projection grids, the map range corresponding to the projection range also includes 5×5 map grids.
[0070] In some application scenarios, the first matching probability between the projection range and the corresponding map range can be determined, for example, by the following formula (3):
[0071] $$P(x,y)=\alpha^{-\frac{\sum_{x_i=x_1}^{x_m}\sum_{y_j=y_1}^{y_n}\frac{\left|\mu_{x_i,y_j}^{m}-\mu_{x_i,y_j}^{r}\right|\left(\sigma_{x_i,y_j}^{m}+\sigma_{x_i,y_j}^{r}\right)N_{x_i,y_j}^{r}}{2\sigma_{x_i,y_j}^{m}\sigma_{x_i,y_j}^{r}}}{\sum_{x_i=x_1}^{x_m}\sum_{y_j=y_1}^{y_n}N_{x_i,y_j}^{r}}}\tag{3}$$
[0072] where (x, y) is the world coordinate of the center projection grid, $(x_i, y_j)$ is the world coordinate of each projection grid within the projection range of the laser point cloud projection data, α is a preset constant parameter, $\mu_{x_i,y_j}^{m}$ is the mean of the laser reflection intensity values of the laser points in the map grid with world coordinate $(x_i, y_j)$, $\mu_{x_i,y_j}^{r}$ is the mean of the laser reflection intensity values of the laser points in the projection grid with world coordinate $(x_i, y_j)$, $\sigma_{x_i,y_j}^{m}$ is the variance of the laser reflection intensity values of the laser points in that map grid, $\sigma_{x_i,y_j}^{r}$ is the variance of the laser reflection intensity values of the laser points in that projection grid, and $N_{x_i,y_j}^{r}$ is the number of laser points in that projection grid. $x_1$ is the abscissa of the map grid with the smallest abscissa in the map range, and $x_m$ is the abscissa of the map grid with the largest abscissa; correspondingly, $y_1$ is the ordinate of the map grid with the smallest ordinate in the map range, and $y_n$ is the ordinate of the map grid with the largest ordinate.
[0073] In other words, in formula (3), P(x, y) is a power function with base α whose exponent is the negative of the normalized sum above.
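A direct transcription of formula (3) in Python under assumed array layouts: each argument is an (m, n) array over the overlapped ranges, with suffix `_m` for map grid statistics and `_r` for projection grid statistics. The small `eps` guard against empty or zero-variance cells is an added assumption, not part of the patent formula:

```python
import numpy as np

def match_probability(mu_m, sigma_m, mu_r, sigma_r, n_r, alpha=2.0, eps=1e-6):
    """First matching probability P(x, y) for one candidate alignment."""
    num = np.abs(mu_m - mu_r) * (sigma_m + sigma_r) * n_r
    den = 2.0 * sigma_m * sigma_r + eps
    exponent = np.sum(num / den) / max(np.sum(n_r), 1.0)
    return alpha ** (-exponent)            # a power of the base alpha, as in (3)
```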
[0074] Step 233: Move the center projection grid O(x, y) within a predetermined offset k, and determine the first matching probability of the laser point cloud projection data corresponding to each current center projection grid O(x, y).
[0075] Here, k can be understood as follows: taking the map grid corresponding to the a priori positioning position O′(x_0, y_0) as the initial position, shifting by 1 to k map grids along the positive and negative directions of the x-axis and by 1 to k map grids along the positive and negative directions of the y-axis forms (2k+1)² candidate map ranges. As shown in Figure 4, the area indicated by the dashed frame 420 is the predetermined range in the map formed when the projection range 410 of the laser point cloud projection data includes 5×5 projection grids and the offset is k = 2.
[0076] As the projection range moves within the predetermined range, the first matching probability of the laser point cloud projection data corresponding to the current center projection grid O(x, y) can be determined from formula (3) above; in other words, the first matching probability between the projection range and each corresponding map range is determined. Taking Figure 4 as an example, when the projection range 410 moves within the predetermined range 420, (2k+1)² = 25 first matching probabilities are obtained accordingly.
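Putting steps 231 to 233 together, a sketch of the scan over the (2k+1)² candidates; the map statistics are assumed to be stored as large 2-D arrays, and (r0, c0) is the assumed array index of the top-left map grid of the zero-offset alignment:

```python
import numpy as np

def scan_offsets(map_mu, map_sigma, proj_mu, proj_sigma, proj_n,
                 r0, c0, k, alpha=2.0, eps=1e-6):
    """Returns the (2k+1, 2k+1) array of first matching probabilities."""
    m, n = proj_mu.shape
    P = np.zeros((2 * k + 1, 2 * k + 1))
    for i in range(-k, k + 1):
        for j in range(-k, k + 1):
            mu_m = map_mu[r0 + i:r0 + i + m, c0 + j:c0 + j + n]
            sg_m = map_sigma[r0 + i:r0 + i + m, c0 + j:c0 + j + n]
            num = np.abs(mu_m - proj_mu) * (sg_m + proj_sigma) * proj_n
            den = 2.0 * sg_m * proj_sigma + eps          # guarded as in the sketch above
            P[i + k, j + k] = alpha ** (-np.sum(num / den) / max(np.sum(proj_n), 1.0))
    return P
```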
[0077] In these optional implementations, step 240 in Figure 2 may further include:
[0078] Step 241: Determine the position of the unmanned vehicle in the laser point cloud reflection value map based on the weighted average of the first matching probabilities.
[0079] Specifically, the position of the unmanned vehicle in the laser point cloud reflection value map can be determined, for example, by the following formula (4):
[0080] $$\bar{x}=\frac{\sum_{i=-k}^{k}\sum_{j=-k}^{k}P(x_0+i,\,y_0+j)^{\alpha}\cdot(x_0+i)}{\sum_{i=-k}^{k}\sum_{j=-k}^{k}P(x_0+i,\,y_0+j)^{\alpha}},\qquad \bar{y}=\frac{\sum_{i=-k}^{k}\sum_{j=-k}^{k}P(x_0+i,\,y_0+j)^{\alpha}\cdot(y_0+j)}{\sum_{i=-k}^{k}\sum_{j=-k}^{k}P(x_0+i,\,y_0+j)^{\alpha}}\tag{4}$$
[0081] where $(x_0, y_0)$ is the world coordinate of the map grid where the a priori positioning position is located, and $P(x_0+i, y_0+j)$ is the first matching probability between the projection range and the corresponding map range when the center projection grid is at map coordinate $(x_0+i, y_0+j)$.
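A transcription of formula (4) under the same assumed layout: `P` is the (2k+1, 2k+1) probability array produced by the scan, and (x0, y0) is the world coordinate of the a priori map grid:

```python
import numpy as np

def weighted_position(P, x0, y0, k, alpha=2.0):
    """Weighted-average position (x_bar, y_bar) per formula (4)."""
    W = P ** alpha                     # P(x0+i, y0+j)^alpha used as weights
    offs = np.arange(-k, k + 1)
    xs = x0 + offs[:, None]            # world x-coordinate of each candidate grid
    ys = y0 + offs[None, :]            # world y-coordinate of each candidate grid
    return np.sum(W * xs) / np.sum(W), np.sum(W * ys) / np.sum(W)
```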
[0082] In addition, in some optional implementations, after the first matching probability P of the laser point cloud projection data within the predetermined range of the laser point cloud reflection value map has been determined by formula (3), the following formula (5) can be used to update it to obtain the updated first matching probability P′:
[0083] $$P'(x,y)=\eta\,P(x,y)\,\bar{P}(x,y)\tag{5}$$
[0084] where $\bar{P}(x,y)$ is the probability, predicted from the previous positioning position, that the unmanned vehicle currently appears at world coordinate (x, y), and η is a preset normalization coefficient.
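A sketch of formula (5): each matching probability is modulated by the motion prediction term $\bar{P}$ and the grid is renormalized, in the manner of a histogram filter update. `P` and `P_bar` are assumed (2k+1, 2k+1) arrays, and interpreting η as the coefficient that makes the grid sum to 1 is an assumption:

```python
import numpy as np

def update_probabilities(P, P_bar):
    """Formula (5): P'(x, y) = eta * P(x, y) * P_bar(x, y)."""
    P_new = P * P_bar
    return P_new / np.sum(P_new)       # eta chosen so that the grid sums to 1
```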
[0085] In these optional implementations, since the first matching probability is updated, formula (4) correspondingly becomes the following formula (6):
[0086] $$\bar{x}=\frac{\sum_{i=-k}^{k}\sum_{j=-k}^{k}P'(x_0+i,\,y_0+j)^{\alpha}\cdot(x_0+i)}{\sum_{i=-k}^{k}\sum_{j=-k}^{k}P'(x_0+i,\,y_0+j)^{\alpha}},\qquad \bar{y}=\frac{\sum_{i=-k}^{k}\sum_{j=-k}^{k}P'(x_0+i,\,y_0+j)^{\alpha}\cdot(y_0+j)}{\sum_{i=-k}^{k}\sum_{j=-k}^{k}P'(x_0+i,\,y_0+j)^{\alpha}}\tag{6}$$
[0087] where $P'(x_0+i, y_0+j)$ is the first matching probability between the projection range and the corresponding map range, updated by formula (5), when the center projection grid is at map coordinate $(x_0+i, y_0+j)$.
[0088] In some optional implementations of the unmanned vehicle positioning method based on laser point cloud reflection value matching of this embodiment, determining the position of the unmanned vehicle in the laser point cloud reflection value map based on the first matching probability in step 240 can also be achieved through the following process:
[0089] Step 242: Further subdivide the map grids within the predetermined range so that each map grid forms p×q sub-grids.
[0090] Step 243: Determine the position of the unmanned vehicle in the laser point cloud reflection value map by the following formula (7):
[0091] $$\bar{x}=\frac{\sum_{x=x_0-k}^{x_0+k}\sum_{y=y_0-k}^{y_0+k}\eta(x)\,P''(x,y)^{\alpha}\cdot x}{\sum_{x=x_0-k}^{x_0+k}\sum_{y=y_0-k}^{y_0+k}\eta(x)\,P''(x,y)^{\alpha}},\qquad \bar{y}=\frac{\sum_{x=x_0-k}^{x_0+k}\sum_{y=y_0-k}^{y_0+k}\eta(y)\,P''(x,y)^{\alpha}\cdot y}{\sum_{x=x_0-k}^{x_0+k}\sum_{y=y_0-k}^{y_0+k}\eta(y)\,P''(x,y)^{\alpha}}\tag{7}$$
[0092] where:
[0093] $(x_0, y_0)$ is the world coordinate of the map grid where the a priori positioning position is located; x varies over the range $[x_0-k, x_0+k]$ with a step of 1/p, and y varies over the range $[y_0-k, y_0+k]$ with a step of 1/q; and:
[0094] $$\eta(x)=\frac{1}{(x-x_0)^{\beta}},\qquad \eta(y)=\frac{1}{(y-y_0)^{\beta}};$$
[0095] β is a preset constant parameter, and P″(x, y) is the probability obtained by bilinear interpolation of the first matching probabilities when the map grid at location (x, y) is used as the center projection grid. Here, the first matching probability may be the one determined by formula (3), or it may be the one updated by formula (5).
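A sketch of steps 242 and 243 and formula (7), assuming SciPy's `RegularGridInterpolator` for the bilinear interpolation. Because the weight η in formula (7) diverges at the a priori grid itself, an absolute value and a small `eps` guard are added here as assumptions:

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def refined_position(P, x0, y0, k, p, q, alpha=2.0, beta=1.0, eps=1e-9):
    """Sub-grid estimate per formula (7); P is the (2k+1, 2k+1) probability array."""
    grid = np.arange(-k, k + 1, dtype=float)
    interp = RegularGridInterpolator((grid, grid), P)   # bilinear by default
    xs = np.linspace(-k, k, 2 * k * p + 1)              # offsets with step 1/p
    ys = np.linspace(-k, k, 2 * k * q + 1)              # offsets with step 1/q
    X, Y = np.meshgrid(xs, ys, indexing="ij")
    pts = np.stack([X.ravel(), Y.ravel()], axis=1)
    Ppp = interp(pts).reshape(X.shape) ** alpha         # P''(x, y)^alpha
    eta_x = 1.0 / (np.abs(X) ** beta + eps)             # eta(x), guarded at x = x0
    eta_y = 1.0 / (np.abs(Y) ** beta + eps)             # eta(y), guarded at y = y0
    x_bar = x0 + np.sum(eta_x * Ppp * X) / np.sum(eta_x * Ppp)
    y_bar = y0 + np.sum(eta_y * Ppp * Y) / np.sum(eta_y * Ppp)
    return x_bar, y_bar
```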
[0096] With further reference to Figure 5, as an implementation of the method shown in the above figures, the present application provides an embodiment of an unmanned vehicle positioning device based on laser point cloud reflection value matching, which corresponds to the method embodiment shown in Figure 2.
[0097] As shown in Figure 5, the unmanned vehicle positioning device based on laser point cloud reflection value matching of this embodiment may include an acquisition module 510, a conversion module 520, a matching probability determination module 530, and a position determination module 540.
[0098] The acquisition module 510 may be configured to acquire first laser point cloud reflection value data matching the current position of the unmanned vehicle, where the first laser point cloud reflection value data includes the first coordinates of each laser point and the laser reflection intensity value corresponding to each laser point in the first laser point cloud reflection value data.
[0099] The conversion module 520 may be configured to convert the first laser point cloud reflection value data into laser point cloud projection data in the ground plane.
[0100] The matching probability determination module 530 may be configured to use the predetermined a priori positioning position in the laser point cloud reflection value map as the initial position to determine the first matching probability of the laser point cloud projection data within the predetermined range of the laser point cloud reflection value map.
[0101] The position determination module 540 may be configured to determine the position of the unmanned vehicle in the laser point cloud reflection value map based on the first matching probability.
[0102] In some optional implementations, the laser point cloud reflection value map may include, for example, M×N map areas formed by dividing the surface of the earth in the ground plane of the world coordinate system, where the map areas have the same size and shape. Each map area may further include m×n map grids, where the map grids have the same size and shape. The laser point cloud reflection value map may also include, for each map grid, the mean of the laser reflection intensity values of the laser points within the location corresponding to the map grid, the variance of those laser reflection intensity values, and the number of laser points within that location.
[0103] In some optional implementation manners, the first coordinates of each laser point may be the coordinates of each laser point in the first laser point cloud reflection value data in the vehicle coordinate system of the unmanned vehicle.
[0104] The conversion module 520 may be further configured to convert the first laser point cloud reflection value data into third laser point cloud reflection value data and project the third laser point cloud reflection value data onto the ground plane to generate the laser point cloud projection data. The third laser point cloud reflection value data includes the third coordinates of each laser point and the laser reflection intensity value corresponding to each laser point in the first laser point cloud reflection value data, where the third coordinates are the coordinates of each laser point of the first laser point cloud reflection value data in the world coordinate system.
[0105] In some optional implementations, the third coordinate X′ is: $X' = (x', y', z')^{T} = RX + T$.
[0106] Here, R is the rotation matrix from the vehicle coordinate system of the unmanned vehicle to the world coordinate system, $X = (x, y, z)^{T}$ is the first coordinate of each laser point in the first laser point cloud reflection value data, and T is the translation matrix from the vehicle coordinate system of the unmanned vehicle to the world coordinate system.
[0107] In some optional implementations, the laser point cloud projection data includes the projection coordinates of each laser point in the first laser point cloud reflection value data, the mean of the laser reflection intensity values of the laser points in each projection grid, the variance of the laser reflection intensity values of the laser points in each projection grid, and the number of laser points in each projection grid.
[0108] Here, the projection coordinate X″ of each laser point in the first laser point cloud reflection value data satisfies: $X'' = (x'', y'')^{T} = SX'$.
[0109] S is the projection matrix and satisfies:
[0110] $$S = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix};$$
[0111] Each projection grid has the same size and shape as the map grid.
[0112] In some optional implementations, the matching probability determination module 530 may be further configured to: align the center projection grid O(x, y) of the laser point cloud projection data with the map grid corresponding to the a priori positioning position O′(x_0, y_0) in the laser point cloud reflection value map so that the two overlap, where the center projection grid O(x, y) is the projection grid representing the unmanned vehicle body in the laser point cloud projection data; determine the first matching probability between the projection range of the laser point cloud projection data and its corresponding map range; and move the center projection grid O(x, y) within a predetermined offset k, determining the first matching probability of the laser point cloud projection data corresponding to each current center projection grid O(x, y).
[0113] The position determination module 540 may be further configured to determine the position of the unmanned vehicle in the laser point cloud reflection value map based on the weighted average of the first matching probabilities.
[0114] In some alternative implementations, the first matching probability P(x,y) corresponding to any central projection grid O(x,y) is:
[0115] $$P(x,y)=\alpha^{-\frac{\sum_{x_i=x_1}^{x_m}\sum_{y_j=y_1}^{y_n}\frac{\left|\mu_{x_i,y_j}^{m}-\mu_{x_i,y_j}^{r}\right|\left(\sigma_{x_i,y_j}^{m}+\sigma_{x_i,y_j}^{r}\right)N_{x_i,y_j}^{r}}{2\sigma_{x_i,y_j}^{m}\sigma_{x_i,y_j}^{r}}}{\sum_{x_i=x_1}^{x_m}\sum_{y_j=y_1}^{y_n}N_{x_i,y_j}^{r}}};$$
[0116] where (x, y) is the world coordinate of the center projection grid, $(x_i, y_j)$ is the world coordinate of each projection grid within the projection range of the laser point cloud projection data, α is a preset constant parameter, $\mu_{x_i,y_j}^{m}$ is the mean of the laser reflection intensity values of the laser points in the map grid with world coordinate $(x_i, y_j)$, $\mu_{x_i,y_j}^{r}$ is the mean of the laser reflection intensity values of the laser points in the projection grid with world coordinate $(x_i, y_j)$, $\sigma_{x_i,y_j}^{m}$ is the variance of the laser reflection intensity values of the laser points in that map grid, $\sigma_{x_i,y_j}^{r}$ is the variance of the laser reflection intensity values of the laser points in that projection grid, and $N_{x_i,y_j}^{r}$ is the number of laser points in that projection grid.
[0117] In some optional implementation manners, the matching probability determination module 530 may be further configured to: update the first matching probability based on the previous positioning position, and the updated first matching probability P'(x,y) is:
[0118] $$P'(x,y)=\eta\,P(x,y)\,\bar{P}(x,y);$$
[0119] where $\bar{P}(x,y)$ is the probability, predicted from the previous positioning position, that the unmanned vehicle currently appears at world coordinate (x, y), and η is a preset normalization coefficient.
[0120] In some optional implementations, the position $(\bar{x}, \bar{y})$ of the unmanned vehicle in the laser point cloud reflection value map determined by the position determination module 540 can be the weighted average given by formula (4) above;
[0121] where $(x_0, y_0)$ is the world coordinate of the map grid where the a priori positioning position is located.
[0122] In other optional implementation manners, the location determining module 540 may be further configured to:
[0123] further subdivide the map grids within the predetermined range so that each map grid forms p×q sub-grids;
[0124] the position $(\bar{x}, \bar{y})$ of the unmanned vehicle in the laser point cloud reflection value map is then
[0125] $$\bar{x}=\frac{\sum_{x=x_0-k}^{x_0+k}\sum_{y=y_0-k}^{y_0+k}\eta(x)\,P''(x,y)^{\alpha}\cdot x}{\sum_{x=x_0-k}^{x_0+k}\sum_{y=y_0-k}^{y_0+k}\eta(x)\,P''(x,y)^{\alpha}},\qquad \bar{y}=\frac{\sum_{x=x_0-k}^{x_0+k}\sum_{y=y_0-k}^{y_0+k}\eta(y)\,P''(x,y)^{\alpha}\cdot y}{\sum_{x=x_0-k}^{x_0+k}\sum_{y=y_0-k}^{y_0+k}\eta(y)\,P''(x,y)^{\alpha}};$$
[0126] where:
[0127] $(x_0, y_0)$ is the world coordinate of the map grid where the a priori positioning position is located; x varies over $[x_0-k, x_0+k]$ with a step of 1/p, and y varies over $[y_0-k, y_0+k]$ with a step of 1/q;
[0128] $$\eta(x)=\frac{1}{(x-x_0)^{\beta}},\qquad \eta(y)=\frac{1}{(y-y_0)^{\beta}};$$
[0129] β is a preset constant parameter, and P″(x, y) is the probability obtained by bilinear interpolation of the first matching probabilities when the map grid at location (x, y) is used as the center projection grid.
[0130] Those skilled in the art can understand that the above-mentioned unmanned vehicle positioning device 500 based on laser point cloud reflection value matching also includes some other well-known structures, such as a processor and a memory; in order not to unnecessarily obscure the embodiments of the present disclosure, these well-known structures are not shown in Figure 5.
[0131] Figure 6 is a schematic structural diagram 600 of an embodiment of an unmanned vehicle of the present application.
[0132] As shown in Figure 6, the unmanned vehicle may include a point cloud reflection value data collection device 610, a storage device 620, and a processor 630.
[0133] The point cloud reflection value data collection device 610 can be used to collect the laser point cloud reflection value data at the current position of the unmanned vehicle, where the laser point cloud reflection value data includes the coordinates of each laser point and the laser reflection intensity value corresponding to each laser point.
[0134] The storage device 620 may be used to store a laser point cloud reflection value map.
[0135] The processor 630 can be used to: project the laser point cloud reflection value data onto the ground plane to generate laser point cloud projection data; use the predetermined a priori positioning position in the laser point cloud reflection value map as the initial position to determine the first matching probability of the laser point cloud projection data within a predetermined range of the laser point cloud reflection value map; and determine the position of the unmanned vehicle in the laser point cloud reflection value map based on the first matching probability.
[0136] In some optional implementations, the laser point cloud reflection value map may include M×N map areas formed by dividing the surface of the earth in the ground plane of the world coordinate system, where the map areas have the same size and shape. Each map area may further include m×n map grids, where the map grids have the same size and shape. The laser point cloud reflection value map may also include, for each map grid, the mean of the laser reflection intensity values of the laser points within the location corresponding to the map grid, the variance of those laser reflection intensity values, and the number of laser points within that location.
[0137] In some optional implementation manners, the first coordinates of each laser point may be the coordinates of each laser point in the first laser point cloud reflection value data in the vehicle coordinate system of the unmanned vehicle.
[0138] The processor 630 may be further configured to convert the first laser point cloud reflection value data into third laser point cloud reflection value data and project the third laser point cloud reflection value data onto the ground plane to generate the laser point cloud projection data. The third laser point cloud reflection value data includes the third coordinates of each laser point and the laser reflection intensity value corresponding to each laser point in the first laser point cloud reflection value data, where the third coordinates are the coordinates of each laser point of the first laser point cloud reflection value data in the world coordinate system.
[0139] In some optional implementations, the third coordinate X′ is: $X' = (x', y', z')^{T} = RX + T$.
[0140] Here, R is the rotation matrix from the vehicle coordinate system of the unmanned vehicle to the world coordinate system, $X = (x, y, z)^{T}$ is the first coordinate of each laser point in the first laser point cloud reflection value data, and T is the translation matrix from the vehicle coordinate system of the unmanned vehicle to the world coordinate system.
[0141] In some optional implementations, the laser point cloud projection data includes the projection coordinates of each laser point in the first laser point cloud reflection value data, the mean of the laser reflection intensity values of the laser points in each projection grid, the variance of the laser reflection intensity values of the laser points in each projection grid, and the number of laser points in each projection grid.
[0142] Here, the projection coordinate X″ of each laser point in the first laser point cloud reflection value data satisfies: $X'' = (x'', y'')^{T} = SX'$.
[0143] S is the projection matrix and satisfies:
[0144] $$S = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix};$$
[0145] Each projection grid has the same size and shape as the map grid.
[0146] In some optional implementations, the processor 630 may be further used to: align the center projection grid O(x, y) of the laser point cloud projection data with the map grid corresponding to the a priori positioning position O′(x_0, y_0) in the laser point cloud reflection value map so that the two overlap, where the center projection grid O(x, y) is the projection grid representing the unmanned vehicle body in the laser point cloud projection data; determine the first matching probability between the projection range of the laser point cloud projection data and its corresponding map range; and move the center projection grid O(x, y) within a predetermined offset k, determining the first matching probability of the laser point cloud projection data corresponding to each current center projection grid O(x, y).
[0147] The processor 630 may be further configured to determine the position of the unmanned vehicle in the laser point cloud reflection value map based on the weighted average of the first matching probabilities.
[0148] In some alternative implementations, the first matching probability P(x,y) corresponding to any central projection grid O(x,y) is:
[0149] $$P(x,y)=\alpha^{-\frac{\sum_{x_i=x_1}^{x_m}\sum_{y_j=y_1}^{y_n}\frac{\left|\mu_{x_i,y_j}^{m}-\mu_{x_i,y_j}^{r}\right|\left(\sigma_{x_i,y_j}^{m}+\sigma_{x_i,y_j}^{r}\right)N_{x_i,y_j}^{r}}{2\sigma_{x_i,y_j}^{m}\sigma_{x_i,y_j}^{r}}}{\sum_{x_i=x_1}^{x_m}\sum_{y_j=y_1}^{y_n}N_{x_i,y_j}^{r}}};$$
[0150] where (x, y) is the world coordinate of the center projection grid, $(x_i, y_j)$ is the world coordinate of each projection grid within the projection range of the laser point cloud projection data, α is a preset constant parameter, $\mu_{x_i,y_j}^{m}$ is the mean of the laser reflection intensity values of the laser points in the map grid with world coordinate $(x_i, y_j)$, $\mu_{x_i,y_j}^{r}$ is the mean of the laser reflection intensity values of the laser points in the projection grid with world coordinate $(x_i, y_j)$, $\sigma_{x_i,y_j}^{m}$ is the variance of the laser reflection intensity values of the laser points in that map grid, $\sigma_{x_i,y_j}^{r}$ is the variance of the laser reflection intensity values of the laser points in that projection grid, and $N_{x_i,y_j}^{r}$ is the number of laser points in that projection grid.
[0151] In some optional implementation manners, the processor 630 may be further configured to: update the first matching probability based on the previous positioning position, and the updated first matching probability P'(x,y) is:
[0152] $$P'(x,y)=\eta\,P(x,y)\,\bar{P}(x,y);$$
[0153] where $\bar{P}(x,y)$ is the probability, predicted from the previous positioning position, that the unmanned vehicle currently appears at world coordinate (x, y), and η is a preset normalization coefficient.
[0154] In some optional implementations, the position $(\bar{x}, \bar{y})$ of the unmanned vehicle in the laser point cloud reflection value map determined by the processor 630 can be the weighted average given by formula (4) above;
[0155] where $(x_0, y_0)$ is the world coordinate of the map grid where the a priori positioning position is located.
[0156] In other optional implementation manners, the processor 630 may be further used to:
[0157] further subdivide the map grids within the predetermined range so that each map grid forms p×q sub-grids;
[0158] the position $(\bar{x}, \bar{y})$ of the unmanned vehicle in the laser point cloud reflection value map is then
[0159] $$\bar{x}=\frac{\sum_{x=x_0-k}^{x_0+k}\sum_{y=y_0-k}^{y_0+k}\eta(x)\,P''(x,y)^{\alpha}\cdot x}{\sum_{x=x_0-k}^{x_0+k}\sum_{y=y_0-k}^{y_0+k}\eta(x)\,P''(x,y)^{\alpha}},\qquad \bar{y}=\frac{\sum_{x=x_0-k}^{x_0+k}\sum_{y=y_0-k}^{y_0+k}\eta(y)\,P''(x,y)^{\alpha}\cdot y}{\sum_{x=x_0-k}^{x_0+k}\sum_{y=y_0-k}^{y_0+k}\eta(y)\,P''(x,y)^{\alpha}};$$
[0160] where:
[0161] $(x_0, y_0)$ is the world coordinate of the map grid where the a priori positioning position is located; x varies over $[x_0-k, x_0+k]$ with a step of 1/p, and y varies over $[y_0-k, y_0+k]$ with a step of 1/q;
[0162] $$\eta(x)=\frac{1}{(x-x_0)^{\beta}},\qquad \eta(y)=\frac{1}{(y-y_0)^{\beta}};$$
[0163] β is a preset constant parameter, and P″(x, y) is the probability obtained by bilinear interpolation of the first matching probabilities when the map grid at location (x, y) is used as the center projection grid.
[0164] Figure 7 is a schematic structural diagram 700 of an embodiment of the unmanned vehicle positioning system of the present application.
[0165] The unmanned vehicle positioning system of this embodiment may include an unmanned vehicle 710 and a positioning server 720.
[0166] The unmanned vehicle 710 may include a point cloud reflection value data collection device 711 and a first communication device 712.
[0167] The point cloud reflection value data collection device 711 can be used to collect the laser point cloud reflection value data at the current position of the unmanned vehicle, where the laser point cloud reflection value data includes the first coordinates of each laser point and the laser reflection intensity value corresponding to each laser point. The first communication device 712 may be used to send the laser point cloud reflection value data to the positioning server.
[0168] The positioning server 720 may include a second communication device 721, a memory 722, and a processor 723.
[0169] The second communication device 721 may be used to receive the laser point cloud reflection value data sent by the first communication device 712. The memory 722 can be used to store the laser point cloud reflection value map.
[0170] The processor 723 can be used to: project the laser point cloud reflection value data onto the ground plane to generate laser point cloud projection data; use the predetermined a priori positioning position in the laser point cloud reflection value map as the initial position to determine the first matching probability of the laser point cloud projection data within a predetermined range of the laser point cloud reflection value map; and determine the positioning result of the unmanned vehicle based on the first matching probability, where the positioning result includes the position information of the unmanned vehicle in the laser point cloud reflection value map.
[0171] In addition, the second communication device 721 is also used to send the positioning result to the first communication device 712.
[0172] In some optional implementations, the laser point cloud reflection value map may include M×N map areas formed by dividing the surface of the earth in the ground plane of the world coordinate system, where the map areas have the same size and shape. Each map area may further include m×n map grids, where the map grids have the same size and shape. The laser point cloud reflection value map also includes, for each map grid, the mean of the laser reflection intensity values of the laser points within the location corresponding to the map grid, the variance of those laser reflection intensity values, and the number of laser points within that location.
[0173] In some optional implementation manners, the first coordinates of each laser point may be the coordinates of each laser point in the first laser point cloud reflection value data in the vehicle coordinate system of the unmanned vehicle.
[0174] The processor 723 may be further configured to convert the first laser point cloud reflection value data into third laser point cloud reflection value data and project the third laser point cloud reflection value data onto the ground plane to generate the laser point cloud projection data. The third laser point cloud reflection value data includes the third coordinates of each laser point and the laser reflection intensity value corresponding to each laser point in the first laser point cloud reflection value data, where the third coordinates are the coordinates of each laser point of the first laser point cloud reflection value data in the world coordinate system.
[0175] In some optional implementations, the third coordinate X′ is: $X' = (x', y', z')^{T} = RX + T$.
[0176] Here, R is the rotation matrix from the vehicle coordinate system of the unmanned vehicle to the world coordinate system, $X = (x, y, z)^{T}$ is the first coordinate of each laser point in the first laser point cloud reflection value data, and T is the translation matrix from the vehicle coordinate system of the unmanned vehicle to the world coordinate system.
[0177] In some optional implementations, the laser point cloud projection data includes the projection coordinates of each laser point in the first laser point cloud reflection value data, the mean of the laser reflection intensity values of the laser points in each projection grid, the variance of the laser reflection intensity values of the laser points in each projection grid, and the number of laser points in each projection grid.
[0178] Here, the projection coordinate X″ of each laser point in the first laser point cloud reflection value data satisfies: $X'' = (x'', y'')^{T} = SX'$.
[0179] S is the projection matrix and satisfies:
[0180] $$S = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \end{pmatrix};$$
[0181] Each projection grid has the same size and shape as the map grid.
[0182] In some optional implementations, the processor 723 may be further used to: align the center projection grid O(x, y) of the laser point cloud projection data with the map grid corresponding to the a priori positioning position O′(x_0, y_0) in the laser point cloud reflection value map so that the two overlap, where the center projection grid O(x, y) is the projection grid representing the unmanned vehicle body in the laser point cloud projection data; determine the first matching probability between the projection range of the laser point cloud projection data and its corresponding map range; and move the center projection grid O(x, y) within a predetermined offset k, determining the first matching probability of the laser point cloud projection data corresponding to each current center projection grid O(x, y).
[0183] The processor 723 may be further configured to determine the position of the unmanned vehicle in the laser point cloud reflection value map based on the weighted average of the first matching probabilities.
[0184] In some alternative implementations, the first matching probability P(x,y) corresponding to any central projection grid O(x,y) is:
[0185] $$P(x,y)=\alpha^{-\frac{\sum_{x_i=x_1}^{x_m}\sum_{y_j=y_1}^{y_n}\frac{\left|\mu_{x_i,y_j}^{m}-\mu_{x_i,y_j}^{r}\right|\left(\sigma_{x_i,y_j}^{m}+\sigma_{x_i,y_j}^{r}\right)N_{x_i,y_j}^{r}}{2\sigma_{x_i,y_j}^{m}\sigma_{x_i,y_j}^{r}}}{\sum_{x_i=x_1}^{x_m}\sum_{y_j=y_1}^{y_n}N_{x_i,y_j}^{r}}};$$
[0186] where (x, y) is the world coordinate of the center projection grid, $(x_i, y_j)$ is the world coordinate of each projection grid within the projection range of the laser point cloud projection data, α is a preset constant parameter, $\mu_{x_i,y_j}^{m}$ is the mean of the laser reflection intensity values of the laser points in the map grid with world coordinate $(x_i, y_j)$, $\mu_{x_i,y_j}^{r}$ is the mean of the laser reflection intensity values of the laser points in the projection grid with world coordinate $(x_i, y_j)$, $\sigma_{x_i,y_j}^{m}$ is the variance of the laser reflection intensity values of the laser points in that map grid, $\sigma_{x_i,y_j}^{r}$ is the variance of the laser reflection intensity values of the laser points in that projection grid, and $N_{x_i,y_j}^{r}$ is the number of laser points in that projection grid.
[0187] In some optional implementation manners, the processor 723 may be further configured to: update the first matching probability based on the previous positioning position, and the updated first matching probability P'(x,y) is:
[0188] $$P'(x,y)=\eta\,P(x,y)\,\bar{P}(x,y);$$
[0189] where $\bar{P}(x,y)$ is the probability, predicted from the previous positioning position, that the unmanned vehicle currently appears at world coordinate (x, y), and η is a preset normalization coefficient.
[0190] In some optional implementations, the position $(\bar{x}, \bar{y})$ of the unmanned vehicle in the laser point cloud reflection value map determined by the processor 723 can be the weighted average given by formula (4) above;
[0191] where $(x_0, y_0)$ is the world coordinate of the map grid where the a priori positioning position is located.
[0192] In other optional implementation manners, the processor 723 may be further used to:
[0193] further subdivide the map grids within the predetermined range so that each map grid forms p×q sub-grids;
[0194] the position $(\bar{x}, \bar{y})$ of the unmanned vehicle in the laser point cloud reflection value map is then
[0195] $$\bar{x}=\frac{\sum_{x=x_0-k}^{x_0+k}\sum_{y=y_0-k}^{y_0+k}\eta(x)\,P''(x,y)^{\alpha}\cdot x}{\sum_{x=x_0-k}^{x_0+k}\sum_{y=y_0-k}^{y_0+k}\eta(x)\,P''(x,y)^{\alpha}},\qquad \bar{y}=\frac{\sum_{x=x_0-k}^{x_0+k}\sum_{y=y_0-k}^{y_0+k}\eta(y)\,P''(x,y)^{\alpha}\cdot y}{\sum_{x=x_0-k}^{x_0+k}\sum_{y=y_0-k}^{y_0+k}\eta(y)\,P''(x,y)^{\alpha}};$$
[0196] where:
[0197] $(x_0, y_0)$ is the world coordinate of the map grid where the a priori positioning position is located; x varies over $[x_0-k, x_0+k]$ with a step of 1/p, and y varies over $[y_0-k, y_0+k]$ with a step of 1/q;
[0198] $$\eta(x)=\frac{1}{(x-x_0)^{\beta}},\qquad \eta(y)=\frac{1}{(y-y_0)^{\beta}};$$
[0199] β is a preset constant parameter, and P″(x, y) is the probability obtained by bilinear interpolation of the first matching probabilities when the map grid at location (x, y) is used as the center projection grid.
[0200] Reference is now made to Figure 8, which shows a schematic structural diagram of a computer system 800 suitable for implementing the processor of the unmanned vehicle or the positioning server of the embodiments of the present application.
[0201] As shown in Figure 8, the computer system 800 includes a central processing unit (CPU) 801, which can perform various appropriate actions and processing according to a program stored in a read-only memory (ROM) 802 or a program loaded from a storage portion 808 into a random access memory (RAM) 803. The RAM 803 also stores the various programs and data required for the operation of the system 800. The CPU 801, the ROM 802, and the RAM 803 are connected to one another through a bus 804. An input/output (I/O) interface 805 is also connected to the bus 804.
[0202] The following components are connected to the I/O interface 805: an input portion 806 including a keyboard, a mouse, and the like; an output portion 807 including a cathode ray tube (CRT), a liquid crystal display (LCD), speakers, and the like; a storage portion 808 including a hard disk and the like; and a communication portion 809 including a network interface card such as a LAN card or a modem. The communication portion 809 performs communication processing via a network such as the Internet. A drive 810 is also connected to the I/O interface 805 as needed. A removable medium 811, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 810 as needed, so that the computer program read from it can be installed into the storage portion 808 as needed.
[0203] In particular, according to an embodiment of the present disclosure, the process described above with reference to the flowchart can be implemented as a computer software program. For example, an embodiment of the present disclosure includes a computer program product, which includes a computer program tangibly embodied on a machine-readable medium, the computer program containing program code for executing the method shown in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication portion 809 and/or installed from the removable medium 811.
[0204] The flowcharts and block diagrams in the drawings illustrate possible implementations of the architecture, functions, and operations of the systems, methods, and computer program products according to various embodiments of the present application. In this regard, each block in a flowchart or block diagram may represent a module, program segment, or portion of code that contains one or more executable instructions for implementing the specified logical function. It should also be noted that, in some alternative implementations, the functions marked in the blocks may occur in an order different from that marked in the drawings; for example, two blocks shown in succession may actually be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved. It should also be noted that each block in the block diagrams and/or flowcharts, and combinations of such blocks, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
[0205] The units involved in the embodiments described in the present application can be implemented in software or in hardware. The described units may also be provided in a processor; for example, it may be described as: a processor including an acquisition module, a conversion module, a matching probability determination module, and a position determination module. The names of these modules do not, under certain circumstances, constitute a limitation on the modules themselves; for example, the acquisition module may also be described as "a module for acquiring first laser point cloud reflection value data matching the current position of the unmanned vehicle".
[0206] As another aspect, the present application also provides a non-volatile computer storage medium, which may be the non-volatile computer storage medium included in the device described in the foregoing embodiments, or a non-volatile computer storage medium that exists alone and is not assembled into the terminal. The above-mentioned non-volatile computer storage medium stores one or more programs that, when executed by a device, cause the device to: obtain first laser point cloud reflection value data matching the current position of the unmanned vehicle, the first laser point cloud reflection value data including the first coordinates of each laser point and the laser reflection intensity value corresponding to each laser point; convert the first laser point cloud reflection value data into laser point cloud projection data in the ground plane; use the predetermined a priori positioning position in the laser point cloud reflection value map as the initial position to determine the first matching probability of the laser point cloud projection data within a predetermined range of the laser point cloud reflection value map; and determine the position of the unmanned vehicle in the laser point cloud reflection value map based on the first matching probability.
[0207] The above description is only a preferred embodiment of the present application and an explanation of the technical principles applied. Those skilled in the art should understand that the scope of the invention involved in the present application is not limited to technical solutions formed by the specific combination of the above technical features; it also covers other technical solutions formed by any combination of the above technical features or their equivalent features without departing from the inventive concept, for example, technical solutions formed by replacing the above features with technical features of similar functions disclosed in (but not limited to) the present application.