Coastal monitoring and defense decision-making system based on multi-dimensional space fusion and method thereof

A spatial-fusion and decision-making technology applied to multi-dimensional databases, structured data retrieval, resources, etc. It addresses the problem that existing systems cannot carry out collaborative defense, and achieves the effect of improving accuracy.

Inactive Publication Date: 2016-06-08
DALIAN LANDSEA MARITECH
3 Cites 37 Cited by

AI-Extracted Technical Summary

Problems solved by technology

[0003] The purpose of the present invention is to overcome the defects of the prior art, provide a coastal monitoring and defense decision-making system and method based o...

Abstract

The invention relates to a coastal monitoring and defense decision-making system based on multi-dimensional space fusion, and to a corresponding method. The system comprises an information fusion subsystem, which performs data fusion on sea-area information and airspace information to obtain sea-area target information and airspace target information; a situation assessment subsystem, connected with the information fusion subsystem, which performs target situation assessment on the sea-area target information and the airspace target information to obtain target threat probabilities; and a security defense subsystem, connected with the situation assessment subsystem, which divides targets into threat levels according to the target threat probability and matches each target threat level with a corresponding target defense scheme. Data fusion enhances target identification accuracy in the sea area and the airspace, thereby improving the accuracy of target state and feature estimation and ensuring a timely and complete evaluation of the target situation and threat level.


Examples


Example Embodiment

[0063] The present invention will be further described below in conjunction with the drawings and specific embodiments.
[0064] The present invention provides a coastal monitoring and defense decision-making system based on multi-dimensional space fusion. It integrates and comprehensively processes the monitoring information and data of multiple platforms and multiple sensors in coastal waters and airspace to provide situation display and threat assessment, thereby providing decision support for safeguarding national sovereignty and national security, for military defense, and for ocean development and ocean management. The system mainly includes an information fusion subsystem, a situation assessment subsystem, and a security defense subsystem. The information fusion subsystem uses space-based infrastructure to realize the integrated acquisition, transmission, processing, networked sharing, and application services of multi-dimensional, multi-source spatial information. The situation assessment subsystem conducts a preliminary assessment based on the dynamically fused information of the multiple platforms and automatically evaluates and predicts any target whose assessment value exceeds the warning value. On the basis of the situation assessment, the security defense subsystem uses the target's friend-or-foe attribute and its position, speed, heading, type, quantity, and other information to estimate and analyze the threat level of the target in quantitative form, and alerts on and intercepts targets that exceed the threat warning conditions. The coastal monitoring and defense decision-making system and method of the present invention are described below with reference to the accompanying drawings.
[0065] See figure 1, which shows the system diagram of the coastal monitoring and defense decision-making system based on multi-dimensional space fusion of the present invention. The system is described below with reference to figure 1.
[0066] As shown in figure 1, the coastal monitoring and defense decision-making system based on multi-dimensional space fusion of the present invention includes an information fusion subsystem 11, a situation assessment subsystem 12, and a security defense subsystem 13. The information fusion subsystem 11 uses space-based infrastructure to realize the integrated acquisition, transmission, processing, networked sharing, and application services of multi-dimensional, multi-source spatial information; it cooperates with multiple platforms to enhance event-handling capability, and combines the advantages of the various networks and systems in the airspace and sea areas to achieve functional complementarity and expand the range of events that can be handled. The information fusion subsystem 11 accesses space-sensing information on the sea situation (sea situation center, VTS, AIS, sea radar, etc.), the air situation (ATC, satellites, aircraft, etc.), and other sources (hydrometeorology, CCTV, etc.), and carries out fusion processing such as spatio-temporal calibration, error correction, data analysis and filtering, parameter matching and correlation, state assessment, and target recognition, thereby realizing comprehensive processing of multi-dimensional, massive, real-time dynamic information. The situation assessment subsystem 12 performs a preliminary assessment based on the dynamically fused multi-platform information and an in-depth assessment of targets that exceed the warning value; both assessments display their results for the commanders to judge. The assessment is realized with a neural network method, which has the following advantages: memory and association functions with good predictive ability; parallel computation, so a large amount of information can be processed and decisions are made quickly; and high reliability, since the decision-making system can continue to work normally even if some neurons fail. The detection capability of multiple platforms is higher than that of a single platform: when one platform discovers a threat target it notifies the other platforms in the system, so the sensors of each platform no longer need to rely only on their own detection equipment but can share the findings of other platforms through the system, thereby improving the concealment and effectiveness of security defense and military decision-making. The security defense subsystem 13 performs threat estimation: on the basis of the situation assessment, it estimates and analyzes the threat degree of the target in quantitative form according to the target's friend-or-foe attribute and its position, speed, heading, type, quantity, and other information, and alerts on targets that exceed the threat warning conditions. According to the set threat judgment criteria, all known targets are screened; different distribution functions are selected to quantify the different situation elements (distance, speed, heading, etc.), appropriate weights are chosen, and a weighted sum gives the threat value used to analyze and judge the degree of threat. The weight indices can be determined by the DELPHI method (expert opinion method). According to the threat assessment result, the threat level is judged, generally divided into 5-10 levels, and different alarm instructions are issued according to the threat level.
[0067] The information fusion subsystem 11 includes a sea area unit 111, an air space unit 112, and a fusion operation unit 113. Both the sea area unit 111 and the air space unit 112 are connected to the fusion operation unit 113.
[0068] The sea area unit 111 is connected to the sea situation center, the VTS (Vessel Traffic Services), the AIS (Automatic Identification System), and the sea radar. The sea area unit 111 is used to obtain sea situation information and perform data fusion to obtain the sea-area target information. The sea situation information includes the sea situation center information from the sea situation center, the VTS information from the VTS, the AIS information from the AIS, and the sea radar information from the sea radar. The sea area unit 111 sends the acquired sea situation center information, VTS information, AIS information, and sea radar information to the fusion operation unit 113 for data fusion, and the sea-area target information is obtained after the data fusion processing.
[0069] The airspace unit 112 communicates with the ATC (air traffic control) system, satellites, and aircraft. The airspace unit 112 is used to obtain air situation information and perform data fusion to obtain the airspace target information. The air situation information includes the ATC information, the satellite information from the satellites, and the aircraft information from the aircraft. The airspace unit 112 sends the acquired ATC information, satellite information, and aircraft information to the fusion operation unit 113 for data fusion, and the airspace target information is obtained after the data fusion processing.
[0070] The fusion operation unit 113 is used to perform data fusion processing on the received information, which consists of the sea situation information or the air situation information. The fusion operation unit 113 includes a modeling module, a preprocessing module, an association module, and a fusion processing module. The modeling module establishes a data fusion model, based on a distributed fusion structure, for all of the information received by the fusion operation unit, that is, the sea situation information sent by the sea area unit 111 or the air situation information sent by the airspace unit 112. The preprocessing module, connected to the modeling module, performs spatial and time calibration of all the information according to the data fusion model: after spatial calibration all the information is transformed into the same coordinate system, and after time calibration all the information is unified to the same moment. The association module, connected to the preprocessing module, uses the nearest neighbor method to correlate the tracks in the information calibrated by the preprocessing module, obtaining the targets contained in the information and the track corresponding to each target. The fusion processing module, connected to the association module, uses the weighted average method to fuse the observations of the same target in all the information and to fuse the tracks corresponding to that target into a fused track; it then associates the fused track with the corresponding target to form the target information. Information fusion combines the advantages of multiple data sources to make up for the shortcomings of any single source. For example, AIS information is rich, its target position data are highly accurate, and its availability is not easily affected by terrain, weather, or sea state; however, AIS information is limited to ships with AIS installed, which is unfavorable for collision avoidance, and the AIS position is provided by GPS, which in high-noise conditions can lose lock on the navigation satellites and fail, and which also suffers from ionospheric delay, multipath interference, and similar problems. Sea radar information, for its part, is limited in content, cannot identify target attributes, has limited measurement accuracy, has blind areas, and is vulnerable to interference. Through data fusion of multiple information sources, more accurate and reliable target data in the airspace and sea areas can be provided, effectively realizing the identification, tracking, and collision avoidance of sea and airspace targets, which is of great significance.
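To make the data flow through the fusion operation unit 113 concrete, the following is a minimal structural sketch of the four-module pipeline in Python. The class and method names are illustrative only and do not appear in the patent; the concrete steps are filled in by the formulas of the following paragraphs.

```python
# Minimal structural sketch of the fusion operation unit pipeline (illustrative
# names, not from the patent): preprocess -> associate -> fuse.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Track:
    target_id: str
    times: List[float]                      # observation times, seconds
    positions: List[Tuple[float, float]]    # positions in a common plane frame, metres


class FusionOperationUnit:
    def preprocess(self, tracks: List[Track]) -> List[Track]:
        """Spatial calibration (common coordinate frame) and time calibration
        (resampling to common reference instants); see formulas (2-1)-(2-3)."""
        return tracks  # placeholder

    def associate(self, ais: List[Track], radar: List[Track]) -> List[Tuple[Track, Track]]:
        """Nearest-neighbour track correlation; see formulas (2-4)-(2-5)."""
        return list(zip(ais, radar))  # placeholder pairing

    def fuse(self, pairs: List[Tuple[Track, Track]]) -> List[Track]:
        """Weighted-average fusion of each associated AIS/radar pair; (2-9)-(2-10)."""
        return [a for a, _ in pairs]  # placeholder

    def run(self, ais: List[Track], radar: List[Track]) -> List[Track]:
        return self.fuse(self.associate(self.preprocess(ais), self.preprocess(radar)))
```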
[0071] The spatial calibration in the preprocessing module includes coordinate transformation and spatial registration. Coordinate transformation converts a target from one spatial coordinate system into another according to their positional relationship, typically by converting between the rectangular (Cartesian) coordinate system and the spatial polar coordinate system.
[0072] Suppose the position coordinates of an arbitrary point p in the rectangular coordinate system are (x, y, z), and its corresponding coordinates in the spatial polar coordinate system are (r, θ, ε), where r is the range, θ the azimuth, and ε the elevation. The two coordinate systems are then related by:

[0073] $$x = r\cos\varepsilon\sin\theta,\qquad y = r\cos\varepsilon\cos\theta,\qquad z = r\sin\varepsilon$$

or, conversely,

$$r = \sqrt{x^2 + y^2 + z^2},\qquad \theta = \arctan\frac{x}{y},\qquad \varepsilon = \arctan\frac{z}{\sqrt{x^2 + y^2}}$$
[0074] Spatial registration eliminates the systematic measurement errors of the information sources: the systematic error of each source is estimated dynamically or statically and the target measurements are then compensated accordingly. The present invention adopts maximum likelihood registration in two-dimensional space. Assume two sensors a and b whose slant-range and azimuth deviations are Δr_a, Δθ_a and Δr_b, Δθ_b respectively. As shown in figure 3, r_a, θ_a and r_b, θ_b denote the measured slant range and azimuth of sensors a and b, (x_a, y_a) and (x_b, y_b) denote the corresponding measurements in the global coordinate system, and (x_sa, y_sa) and (x_sb, y_sb) denote the positions of the sensors in the global coordinate system.
[0075] The maximum likelihood registration method takes into account the random measurement noise of the sensor. Assume that the noise vector measured by the sensor is:
[0076] $$v = [\,v_{r_a},\; v_{\theta_a},\; v_{r_b},\; v_{\theta_b}\,]^T \qquad (1.2)$$
[0077] whose components denote the measurement noise of the slant range and azimuth of sensors a and b, respectively, and v obeys a Gaussian distribution.
[0078] From figure 3 the following basic equations can be derived:
[0079] $$\begin{cases} x_a = x_{sa} + r_a\sin\theta_a \\ y_a = y_{sa} + r_a\cos\theta_a \\ x_b = x_{sb} + r_b\sin\theta_b \\ y_b = y_{sb} + r_b\cos\theta_b \end{cases} \qquad (1.3)$$
[0080] When the measurement noise of the sensors is taken into account, we have:
[0081] $$\begin{cases} r_a = r_a' + \Delta r_a + v_{r_a} \\ \theta_a = \theta_a' + \Delta\theta_a + v_{\theta_a} \\ r_b = r_b' + \Delta r_b + v_{r_b} \\ \theta_b = \theta_b' + \Delta\theta_b + v_{\theta_b} \end{cases} \qquad (1.4)$$
[0082] where r_a', θ_a' and r_b', θ_b' denote the true values of the sensor measurements, and Δr_a, Δθ_a and Δr_b, Δθ_b denote the systematic measurement errors of the sensors. Substituting equation (1.4) into equation (1.3) and expanding the resulting equations to first order (Taylor series) in Δr_a, Δθ_a, Δr_b and Δθ_b gives
[0083] $$\begin{cases} x_a - x_b \approx \sin\theta_a\,\Delta r_a - \sin\theta_b\,\Delta r_b + r_a\cos\theta_a\,\Delta\theta_a - r_b\cos\theta_b\,\Delta\theta_b \\ y_a - y_b \approx \cos\theta_a\,\Delta r_a - \cos\theta_b\,\Delta r_b - r_a\sin\theta_a\,\Delta\theta_a + r_b\sin\theta_b\,\Delta\theta_b \end{cases} \qquad (1.5)$$
[0084] Using the sensor measurements at the different moments k = 1, 2, ..., N, the following pairs of equations are generated from equation (1.5):
[0085] $$\begin{cases} x_{a,k} - x_{b,k} \approx \sin\theta_{a,k}\,\Delta r_a - \sin\theta_{b,k}\,\Delta r_b + r_{a,k}\cos\theta_{a,k}\,\Delta\theta_a - r_{b,k}\cos\theta_{b,k}\,\Delta\theta_b \\ y_{a,k} - y_{b,k} \approx \cos\theta_{a,k}\,\Delta r_a - \cos\theta_{b,k}\,\Delta r_b - r_{a,k}\sin\theta_{a,k}\,\Delta\theta_a + r_{b,k}\sin\theta_{b,k}\,\Delta\theta_b \end{cases} \qquad (1.6)$$
[0086] After N measurements there are 2N such equations, and when N ≥ 2 the four unknowns in equation (1.6) can be solved for. Collecting the measured differences into a vector and linearizing in the deviation vector x and the measurement noise, the linear equations after N measurements can be written as:
[0087] $$z = A(x + b) = Ax + Ab \qquad (1.7)$$
[0088] where:
[0089] $$\begin{cases} z = [\,\ldots,\; x_{a,i} - x_{b,i},\; y_{a,i} - y_{b,i},\; \ldots\,]^T \\ x = [\,\Delta r_a,\; \Delta\theta_a,\; \Delta r_b,\; \Delta\theta_b\,]^T \end{cases}, \quad (i = 1, 2, \ldots, N) \qquad (1.8)$$
[0090] $$A = \begin{bmatrix} \sin\theta_{a,1} & r_{a,1}\cos\theta_{a,1} & -\sin\theta_{b,1} & -r_{b,1}\cos\theta_{b,1} \\ \cos\theta_{a,1} & -r_{a,1}\sin\theta_{a,1} & -\cos\theta_{b,1} & r_{b,1}\sin\theta_{b,1} \\ \vdots & \vdots & \vdots & \vdots \\ \sin\theta_{a,N} & r_{a,N}\cos\theta_{a,N} & -\sin\theta_{b,N} & -r_{b,N}\cos\theta_{b,N} \\ \cos\theta_{a,N} & -r_{a,N}\sin\theta_{a,N} & -\cos\theta_{b,N} & r_{b,N}\sin\theta_{b,N} \end{bmatrix} \qquad (1.9)$$
[0091] Furthermore, the estimation of the sensor deviation vector x based on the maximum likelihood method can be obtained as:
[0092]
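The closed-form estimate itself appears only as an image in the source and is not reproduced above. Under the zero-mean Gaussian noise assumption stated earlier, the standard maximum-likelihood (generalized least-squares) solution of z = Ax + Ab would take the form

$$\hat{x}_{ML} = \left(A^T \Sigma^{-1} A\right)^{-1} A^T \Sigma^{-1} z,$$

where Σ denotes the covariance of the noise term Ab; if the noise is treated as white, this reduces to the ordinary least-squares estimate $\hat{x} = (A^T A)^{-1} A^T z$. This is a reconstruction consistent with the derivation above, not a verbatim copy of the patent's formula.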
[0093] The calculation process of the fusion calculation unit 113 is described below by taking the fusion calculation of the AIS information and the sea radar information as an example.
[0094] First, the modeling module builds a data fusion model for the AIS information and the radar information based on their characteristics, combined with a distributed fusion structure, as shown in figure 2, where O_1 denotes the output track of the first target and O_n denotes the output track of the n-th target. There are n ships at sea, each equipped with both AIS and sea radar. Since the AIS target information and the radar target information come from independent sensors, the data of the two sensors must first be calibrated in space and time so that they are unified spatially and temporally; then the n AIS target tracks and the n radar tracks are matched and correlated to extract the AIS and radar data belonging to the same target; finally, the AIS data and radar data of the same target are fused to obtain the best data fusion result.
[0095] Then, the preprocessing module performs spatial calibration and time calibration on all information according to the data fusion model. The target position data obtained by AIS is expressed as longitude and latitude, while the target position data obtained by radar is expressed as distance and azimuth. Therefore, before associating the target information, the two need to be uniformly transformed into rectangular coordinates.
[0096] AIS coordinate transformation: the AIS target position information comes from GPS receivers, and GPS uses the internationally adopted WGS-84 coordinate system. The origin of the WGS-84 coordinate system is at the earth's center of mass, the Z axis points toward the earth's pole, the X axis points to the intersection of the prime meridian and the equator, and the Y axis completes a right-handed system with the X and Z axes. The ellipsoid parameters used by WGS-84 are:
[0097] a = 6378137.0 m (semi-major axis)
[0098] b = 6356752.3142 m (semi-minor axis)
[0099] c = 6399593.6258 m (polar radius of curvature)
[0100] f = 1/298.257223563 (flattening)
[0101] e² = 0.0066943799013 (first eccentricity squared)
[0102] e'² = 0.00673949674227 (second eccentricity squared)
[0103] The Gauss-Krüger projection is used to convert between the WGS-84 system and the plane rectangular coordinate system; see formula (2-1).
[0104]
[0105] where X and Y are the horizontal and vertical coordinates of the point in the plane rectangular coordinate system; (λ, B) are the geographic coordinates of the point in radians, with the longitude λ reckoned from the central meridian and B the latitude; S is the meridian arc length from the equator to latitude B; N is the radius of curvature of the prime vertical at latitude B; η is the second eccentricity of the earth ellipsoid; and a and b are the semi-major and semi-minor axes of the earth ellipsoid.
[0106] with
[0107] $$V = \sqrt{1 + \eta^2}$$
[0108] $$N = \frac{c}{V}$$
[0110] where:
[0111] $$\beta_0 = 1 - \tfrac{3}{4}e'^2 + \tfrac{45}{64}e'^4 - \tfrac{175}{256}e'^6 + \tfrac{11025}{16384}e'^8$$
[0112] $$\beta_2 = \beta_0 - 1$$
[0113] $$\beta_4 = \tfrac{15}{32}e'^4 - \tfrac{175}{384}e'^6 + \tfrac{3675}{8192}e'^8$$
[0114] $$\beta_6 = -\tfrac{35}{96}e'^6 + \tfrac{735}{2048}e'^8$$
[0115] $$\beta_8 = \tfrac{315}{1024}e'^8$$
[0116] Radar coordinate transformation: the radar data are in polar coordinates, with the target position expressed as range R and azimuth θ, which are converted into the rectangular-coordinate components x_R (x-axis) and y_R (y-axis):
[0117] $$\begin{cases} x_R = R\cos\theta \\ y_R = R\sin\theta \end{cases} \qquad (2\text{-}2)$$
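As a hedged illustration of the two coordinate conversions above, the short Python sketch below delegates the Gauss-Krüger projection of the AIS latitude/longitude to the pyproj library (an assumption made here for brevity; the patent itself uses the series expansion of formula (2-1)) and applies formula (2-2) to the radar range/azimuth. The central meridian of 123°E and the example coordinates are arbitrary illustrative choices.

```python
# Sketch of the AIS and radar coordinate conversions. The Gauss-Krüger projection
# is delegated to pyproj; the zone (central meridian 123°E) is illustrative only.
import math
from pyproj import Transformer

GK = Transformer.from_crs(
    "EPSG:4326",                                             # WGS-84 latitude/longitude
    "+proj=tmerc +lat_0=0 +lon_0=123 +k=1 +x_0=500000 "
    "+y_0=0 +ellps=WGS84 +units=m +no_defs",                 # Gauss-Krüger style zone
    always_xy=True,
)

def ais_to_plane(lon_deg: float, lat_deg: float) -> tuple:
    """AIS (GPS) longitude/latitude -> plane rectangular coordinates X, Y in metres."""
    return GK.transform(lon_deg, lat_deg)

def radar_to_plane(r_m: float, azimuth_rad: float) -> tuple:
    """Radar range/azimuth -> rectangular components, as in formula (2-2)."""
    return r_m * math.cos(azimuth_rad), r_m * math.sin(azimuth_rad)

# Example: a target at 38.95°N, 121.65°E from AIS, and at 5 km / 30° from radar.
print(ais_to_plane(121.65, 38.95))
print(radar_to_plane(5000.0, math.radians(30.0)))
```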
[0118] In multi-sensor fusion the observation data are not synchronized in time, because the sensors have different sampling rates, different starting times, and so on. Specifically, the radar scan period over the target is generally fixed, while the AIS reporting period varies with the navigation state of the ship. Since the radar and AIS systems have different data rates, their information must be unified to the same moment before the subsequent track correlation processing.
[0119] Suppose the sampling times of the AIS and the radar within the same time period are:
[0120] AIS sampling time sequence: T_Ai = t_Ai1, t_Ai2, ..., t_Aif   (i = 1, 2, ..., n)
[0121] Radar sampling time sequence: T_Rj = t_Rj1, t_Rj2, ..., t_Rjt   (j = 1, 2, ..., n)
[0122] The interval |t_Ai - t_A(i-1)| (the time difference between consecutive AIS reports) is compared with |t_Rj - t_R(j-1)| (the time difference between consecutive radar scans). The sensor with the smaller interval is used to provide the reference sampling times, and the data with the higher data rate are then interpolated or extrapolated (Lagrangian interpolation) to find their position data at each reference sampling time.
[0123] Suppose the target positions obtained from the AIS data at times t_A(i-1) and t_A(i+1) are (x_A(i-1), y_A(i-1)) and (x_A(i+1), y_A(i+1)). When |t_Ai - t_A(i-1)| ≤ |t_Rj - t_R(j-1)|, formula (2-3) gives the AIS position data (x_Aj, y_Aj) at the radar time t_Rj. Other information such as speed and course can be obtained by the same method.
[0124] $$\begin{cases} x_{Aj} = x_{A(i-1)} + \dfrac{(t_{Rj} - t_{A(i-1)})\,(x_{A(i+1)} - x_{A(i-1)})}{t_{A(i+1)} - t_{A(i-1)}} \\[2ex] y_{Aj} = y_{A(i-1)} + \dfrac{(t_{Rj} - t_{A(i-1)})\,(y_{A(i+1)} - y_{A(i-1)})}{t_{A(i+1)} - t_{A(i-1)}} \end{cases} \qquad (2\text{-}3)$$
[0125] When the target is moving steadily, the n radar sampling times are used as the reference sampling times. When the target state changes rapidly, the AIS reporting period becomes shorter than the radar scan period, and the AIS sampling times are used as the reference instead. In this way the reference sampling times are adjusted automatically according to the maneuvering of the target.
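The time calibration of formula (2-3) amounts to piecewise-linear interpolation of the AIS reports onto the radar sampling instants. The sketch below is a simplified illustration with hypothetical variable names; it uses the two AIS samples bracketing each radar time rather than the exact (i-1, i+1) indexing of the formula.

```python
# Minimal sketch of the time-calibration step: AIS samples are linearly
# interpolated to the radar sampling instants, in the spirit of formula (2-3).
from bisect import bisect_left
from typing import List, Tuple

def ais_at_radar_times(t_ais: List[float], xy_ais: List[Tuple[float, float]],
                       t_radar: List[float]) -> List[Tuple[float, float]]:
    """Return the AIS position interpolated at each radar sampling time."""
    out = []
    for t_rj in t_radar:
        i = max(1, min(bisect_left(t_ais, t_rj), len(t_ais) - 1))
        t0, t1 = t_ais[i - 1], t_ais[i]
        (x0, y0), (x1, y1) = xy_ais[i - 1], xy_ais[i]
        k = (t_rj - t0) / (t1 - t0)      # interpolation ratio between bracketing samples
        out.append((x0 + k * (x1 - x0), y0 + k * (y1 - y0)))
    return out

# Example: AIS reports every 10 s, radar scans every 3 s.
t_a = [0.0, 10.0, 20.0]
p_a = [(0.0, 0.0), (50.0, 20.0), (100.0, 40.0)]
print(ais_at_radar_times(t_a, p_a, [3.0, 6.0, 9.0, 12.0]))
```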
[0126] Then, the correlation module uses the nearest neighbor method to perform track correlation.
[0127] Let (x_Ak, y_Ak) and (x_Rk, y_Rk) be the target positions of the AIS and the ARPA radar at time k after time and space calibration, and let Δx and Δy be the current ARPA radar tracking gates. The track correlation between AIS and ARPA is divided into the following two cases:
[0128] There is only one target in the tracking gate:
[0129] When the target position information of the AIS and the radar satisfies formula (2-4), the track correlation between the AIS target and the ARPA radar target is established.
[0130] $$\begin{cases} |x_{Ak} - x_{Rk}| \le \Delta x \\ |y_{Ak} - y_{Rk}| \le \Delta y \end{cases} \qquad (2\text{-}4)$$
[0131] There are multiple targets in the tracking gate:
[0132] To improve the quality of the correlation, m correlation tests are performed and a distance function ρ_ij is established between the correlation test samples, where i denotes the i-th group of AIS data and j denotes the j-th group of radar data; i and j are independent and uncorrelated.
[0133] $$\rho_{ij} = \frac{1}{2}\left[\frac{1}{m}\Big(\sum_{n=1}^{m}(x_{Ain} - x_{Rjn})^2\Big)^{1/2} + \frac{1}{m}\Big(\sum_{n=1}^{m}(y_{Ain} - y_{Rjn})^2\Big)^{1/2}\right], \quad (n = 1, 2, \ldots, m) \qquad (2\text{-}5)$$
[0134] According to formula (2-5), for a given group j of ARPA data, the AIS data corresponding to the same target are found by determining the i that minimizes the distance ρ_ij.
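The following sketch illustrates the gate test of formula (2-4) and the nearest-neighbour selection based on the distance function of formula (2-5). The exact normalization of (2-5) is partly reconstructed here (the x and y root-sum-square terms are each divided by m), and the function and variable names are illustrative.

```python
# Sketch of the nearest-neighbour track correlation: for each radar (ARPA) track,
# pick the AIS track that minimises the distance statistic of formula (2-5),
# subject to the tracking-gate test of formula (2-4).
import math
from typing import List, Tuple

Track = List[Tuple[float, float]]   # positions at the m common sampling times

def rho(ais: Track, radar: Track) -> float:
    """Distance function between one AIS track and one radar track (formula 2-5)."""
    m = len(radar)
    dx = math.sqrt(sum((xa - xr) ** 2 for (xa, _), (xr, _) in zip(ais, radar))) / m
    dy = math.sqrt(sum((ya - yr) ** 2 for (_, ya), (_, yr) in zip(ais, radar))) / m
    return 0.5 * (dx + dy)

def associate(ais_tracks: List[Track], radar_tracks: List[Track],
              gate_x: float, gate_y: float) -> List[Tuple[int, int]]:
    pairs = []
    for j, rt in enumerate(radar_tracks):
        candidates = [
            i for i, at in enumerate(ais_tracks)
            if all(abs(xa - xr) <= gate_x and abs(ya - yr) <= gate_y
                   for (xa, ya), (xr, yr) in zip(at, rt))    # gate test, formula (2-4)
        ]
        if candidates:
            best = min(candidates, key=lambda i: rho(ais_tracks[i], rt))
            pairs.append((best, j))
    return pairs
```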
[0135] Finally, the target fusion is performed by the fusion processing module.
[0136] Tracks determined by the track correlation test to belong to the same target are processed by track fusion to obtain a single track. The track fusion uses an intuitive and efficient weighted average method, which performs statistically weighted processing directly on the calibrated data, reduces information loss, and helps improve the accuracy of the fused track. Let the error variance of the AIS measurement be σ_A² with weighting factor w_1, let the error variance of the radar measurement be σ_R² with weighting factor w_2, and let the fused value be X.
[0137] The mean square error of the fused value is given by formula (2-6):
[0138] $$\sigma^2 = E[(x - X)^2] = E\big[w_1^2(x - x_1)^2 + w_2^2(x - x_2)^2 + 2w_1 w_2 (x - x_1)(x - x_2)\big] \qquad (2\text{-}6)$$
[0139] where:
[0140] $$E[(x - x_1)(x - x_2)] = 0$$
[0141] Further simplification gives formula (2-7):
[0142] $$\sigma^2 = E\left[\sum_{p=1}^{2} w_p^2 (x - x_p)^2\right] = w_1^2\sigma_1^2 + w_2^2\sigma_2^2 \qquad (2\text{-}7)$$
[0143] To make σ² take its minimum value, differentiate σ² with respect to the weights, set the derivative to zero, and combine the result with the constraints of equation (2-8):
[0144] $$\begin{cases} w_1 + w_2 = 1 \\ X = w_1 x_1 + w_2 x_2 \end{cases} \qquad (2\text{-}8)$$
[0145] which gives $$w_1 = \frac{\sigma_2^2}{\sigma_1^2 + \sigma_2^2}, \qquad w_2 = \frac{\sigma_1^2}{\sigma_1^2 + \sigma_2^2}.$$
[0146] According to this conclusion on the optimal weighting factors, the weight factors for the fused quantities can be determined as shown in formula (2-9):
[0147] $$W_{AL} = \frac{\sigma_{RL}^2}{\sigma_{RL}^2 + \sigma_{AL}^2}, \quad W_{RL} = \frac{\sigma_{AL}^2}{\sigma_{RL}^2 + \sigma_{AL}^2}; \qquad W_{A\theta} = \frac{\sigma_{R\theta}^2}{\sigma_{R\theta}^2 + \sigma_{A\theta}^2}, \quad W_{R\theta} = \frac{\sigma_{A\theta}^2}{\sigma_{R\theta}^2 + \sigma_{A\theta}^2}; \qquad W_{AV} = \frac{\sigma_{RV}^2}{\sigma_{RV}^2 + \sigma_{AV}^2}, \quad W_{RV} = \frac{\sigma_{AV}^2}{\sigma_{RV}^2 + \sigma_{AV}^2} \qquad (2\text{-}9)$$
[0148] where:
[0149] σ_RL², σ_AL² — range measurement accuracy of the radar and the AIS;
[0150] σ_Rθ², σ_Aθ² — angle measurement accuracy of the radar and the AIS;
[0151] σ_RV², σ_AV² — speed measurement accuracy of the radar and the AIS.
[0152] The fused target data are then obtained by formula (2-10).
[0153] $$\begin{cases} L = W_{RL}\,L_R + W_{AL}\,L_A \\ \theta = W_{R\theta}\,\theta_R + W_{A\theta}\,\theta_A \\ V = W_{RV}\,V_R + W_{AV}\,V_A \end{cases} \qquad (2\text{-}10)$$
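A minimal numeric illustration of the variance-weighted fusion of formulas (2-9) and (2-10) follows; the measurement values and accuracy figures are hypothetical.

```python
# Sketch of the weighted-average track fusion of formulas (2-9)/(2-10): the weight
# of each source is inversely proportional to its error variance.
def fuse(value_radar: float, value_ais: float,
         var_radar: float, var_ais: float) -> float:
    """Optimal weighted average of one radar and one AIS measurement."""
    w_ais = var_radar / (var_radar + var_ais)    # W_A = sigma_R^2 / (sigma_R^2 + sigma_A^2)
    w_radar = var_ais / (var_radar + var_ais)    # W_R = sigma_A^2 / (sigma_R^2 + sigma_A^2)
    return w_radar * value_radar + w_ais * value_ais

# Fuse range L, bearing theta and speed V with per-quantity accuracies, as in (2-10).
L = fuse(5020.0, 4990.0, var_radar=40.0**2, var_ais=10.0**2)
theta = fuse(30.4, 30.1, var_radar=0.5**2, var_ais=0.2**2)
V = fuse(12.8, 12.5, var_radar=0.6**2, var_ais=0.1**2)
print(L, theta, V)
```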
[0154] The situation assessment subsystem 12 is connected to the information fusion subsystem 11 and is used to perform target situation assessment on the sea-area target information and the airspace target information to obtain the target threat probability. The situation assessment subsystem 12 is connected with multiple processing platforms; it distributes the received sea-area target information and airspace target information to the multiple platforms, which carry out detection and processing. When one platform finds a threat target it notifies the situation assessment subsystem 12, which in turn notifies the other platforms of the threat target. Each processing platform therefore no longer depends only on its own detection equipment but can share the findings of the other platforms through the system, thereby improving the concealment and effectiveness of security defense and military decision-making.
[0155] The situation assessment subsystem 12 includes a preliminary assessment unit 121, a depth assessment unit 122, and a display unit 123. The preliminary assessment unit 121 is used to evaluate and judge the target attributes contained in the sea-area target information and the airspace target information, to assign corresponding assessment values to the sea-area target information and the airspace target information, and to obtain the corresponding target motion trajectory and target attribute information. The sea-area target information and the airspace target information include target information, target attributes, and static information, where the target attributes include heading, speed, target type, and other information. After evaluation, the preliminary assessment unit 121 outputs information such as the target motion trajectory, target type, target size, number of target appearances, target threat, target interception success rate, and target interception time. The preliminary assessment unit 121 can also draw on the meteorological resources, administrative resources, historical resources, and human resources of other processing platforms, for example to count the number of times the target has appeared or posed a threat.
[0156] The depth assessment unit 122 is connected to the preliminary assessment unit 121 and is used to perform in-depth assessment of the sea-area target information and airspace target information whose assessment value exceeds the warning value, using a neural network algorithm to compute the target threat probability and target-related information from the sea-area target information and airspace target information. A target whose assessment value exceeds the warning value is one whose threat degree exceeds the set warning value and is therefore of greater concern; such a target is assessed in depth with the neural network algorithm. The information obtained by the preliminary assessment unit 121, including multiple targets and multiple tracks, is used as the input, and the target threat probabilities and target-related information are output by the neural network algorithm. A neural network algorithm is a mathematical model that imitates the behavioral characteristics of biological neural networks and performs distributed, parallel information processing; such a network relies on the complexity of the system and processes information by adjusting the interconnections among a large number of internal nodes. As shown in figure 5, the neurons of the network are arranged in layers and each neuron is connected only to the neurons of the previous layer. The first layer is the input layer 21, composed of linear transformation units; the middle consists of one or more hidden layers 22; and the uppermost layer is the output layer 23. Both the hidden layers 22 and the output layer 23 are composed of nonlinear transformation units.
[0157] To use the neural network for situation assessment and prediction, the situation assessment network must first be constructed. The main steps are as follows. 1. Determine the number of nodes in each layer: the number of input-layer nodes is determined from the preliminary multi-platform situation assessment information; for example, the input nodes include evaluation indicators such as target combat capability, angle, distance, height, and speed. The number of hidden-layer nodes is related to the number of input-layer nodes, the number of output-layer nodes, and the difficulty of the problem; an empirical formula is usually used to delimit the range of hidden nodes, and the optimal number is then determined by repeated training. 2. Generate the network samples: the engineering fuzzy-set method is used to determine the weight coefficients of the factors and form the initial training samples, which are then adjusted and optimized by expert correction to generate the final network training samples. 3. Train the network: a learning algorithm combining the standard gradient descent method and an exponential gradient descent method is used to train the neural network so that the error meets the requirements and the network is robust. Once the situation assessment network is built, the system can automatically estimate and predict the situation of an abnormal target after element extraction and identification of the abnormal target; it can also refer to other information, such as video and radar resources, so that an operator can decide whether the abnormal target needs to be estimated and predicted.
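As an illustrative sketch of such a feed-forward situation-assessment network, the snippet below uses scikit-learn's MLPClassifier as a stand-in for the training procedure described above. The indicator set, its normalization, the hidden-layer size, and the tiny training set are all hypothetical and serve only to show the input-hidden-output structure and the threat-probability output.

```python
# Illustrative feed-forward network for situation assessment (hypothetical data).
import numpy as np
from sklearn.neural_network import MLPClassifier

# Inputs: [combat capability, heading angle, distance, height, speed], normalised to 0..1
X_train = np.array([
    [0.9, 0.1, 0.2, 0.3, 0.8],   # fast, close, capable target -> threat
    [0.2, 0.8, 0.9, 0.7, 0.2],   # slow, distant target        -> no threat
    [0.8, 0.2, 0.3, 0.4, 0.7],
    [0.1, 0.9, 0.8, 0.8, 0.3],
])
y_train = np.array([1, 0, 1, 0])  # 1 = threatening, 0 = non-threatening

net = MLPClassifier(hidden_layer_sizes=(8,), activation="logistic",
                    solver="sgd", learning_rate_init=0.1, max_iter=5000,
                    random_state=0)
net.fit(X_train, y_train)

# Threat probability for a new fused target report
print(net.predict_proba(np.array([[0.7, 0.15, 0.25, 0.35, 0.75]]))[0, 1])
```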
[0158] The display unit 123 is connected to the preliminary assessment unit 121 and the depth assessment unit 122, and is used to display the assessment value, target motion trajectory, and target attribute information obtained by the preliminary assessment unit 121, as well as the target threat probability and target-related information obtained by the depth assessment unit 122. Through the display unit 123 the situation of the target is presented intuitively to the commanders for overall deployment.
[0159] The security defense subsystem 13 is connected to the situation assessment subsystem 12, and the security defense subsystem 13 is used to classify the target threat level according to the target threat probability, and match the target threat level with the corresponding target defense plan.
[0160] The security defense subsystem 13 includes a threat level judgment unit 131 and a target defense decision unit 132 connected to the threat level judgment unit 131. The threat level judgment unit 131 is used to classify the target threat level, and the target defense decision unit 132 is used to match the target threat level with the corresponding target defense plan. The security defense subsystem 13 displays the corresponding target defense plan for the commanders to control.
[0161] The threat level judgment unit 131 is equipped with a storage module, a quantification module, and a judgment module. The storage module stores an item comparison table, which includes a target type threat level comparison table, a target speed threat level comparison table, a target heading threat level comparison table, a target distance threat level comparison table, and a target threat level division table; corresponding weights are assigned to target type, target speed, target heading, and target distance to form a weight table, which is also stored in the storage module. The quantification module quantifies the indicators in the target threat probability to obtain the target type, target speed, target heading, and target distance corresponding to the target threat probability. The judgment module, connected to the quantification module and the storage module, looks up the item comparison table for the target type, target speed, target heading, and target distance obtained by the quantification module to obtain the corresponding threat values, looks up the weight table to obtain the corresponding weights, calculates the threat degree value by the weighted sum of the threat values and their weights, and then searches the target threat level division table with the calculated threat degree value to obtain the matching threat level as the target threat level.
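A simplified sketch of this lookup-and-weighted-sum calculation is given below. Because the patent's comparison tables (Tables 1.1-2.1) are not reproduced in this text, every table value, weight, and level boundary in the sketch is hypothetical.

```python
# Sketch of the threat-level judgment unit: quantify each situation element via a
# lookup table, take the weighted sum, then map the result to a threat level.
# All table values, weights and level boundaries below are hypothetical.
SPEED_THREAT = [(10, 1), (20, 2), (30, 3), (40, 4), (float("inf"), 5)]   # kn  -> grade
DIST_THREAT  = [(2, 5), (5, 4), (10, 3), (20, 2), (float("inf"), 1)]     # nm  -> grade
TYPE_THREAT  = {"civilian": 1, "unknown": 3, "military": 5}
WEIGHTS = {"type": 0.35, "speed": 0.25, "heading": 0.2, "distance": 0.2}

def table_lookup(table, value):
    return next(grade for bound, grade in table if value <= bound)

def threat_value(target):
    scores = {
        "type": TYPE_THREAT[target["type"]],
        "speed": table_lookup(SPEED_THREAT, target["speed_kn"]),
        "heading": 5 if target["closing"] else 1,
        "distance": table_lookup(DIST_THREAT, target["distance_nm"]),
    }
    return sum(WEIGHTS[k] * v for k, v in scores.items())   # weighted sum

def threat_level(value, levels=5, vmin=1.0, vmax=5.0):
    """Map the weighted threat value onto 1..levels (the patent allows 5-10 levels)."""
    return min(levels, 1 + int((value - vmin) / (vmax - vmin) * levels))

t = {"type": "military", "speed_kn": 28, "closing": True, "distance_nm": 4}
v = threat_value(t)
print(v, threat_level(v))
```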
[0162] On the basis of the situation assessment, the threat level judgment unit 131 estimates and analyzes, in quantitative form, the target's friend-or-foe attribute and its position, speed, heading, type, and other information. According to the set threat judgment criteria it selects appropriate weights for the quantified information, performs a weighted sum to calculate the threat value, analyzes and judges the threat degree, completes the threat level division, and implements the defense processing of sea and air targets. The evaluation indices of a sea-area target are set as target type, target speed, target course, and target distance; the evaluation indices of an airspace target are set as target type, target speed, target heading angle, target distance, and target height; and a weight is assigned to each evaluation index.
[0163] Sea-area targets are divided by type into military ships and civilian ships, and are assigned threat grades of 1 to 5 for threat estimation. The specific description is shown in Table 1.1.
[0166] Table 1.1 is a comparison table of the threat levels of sea-area target types.
[0167] From the target's speed and speed changes, we can infer the other party's basic intentions, and estimate whether the target poses a threat to us and the size of the threat. The specific description is shown in Table 1.2.
[0169] Table 1.2 is a comparison table of sea-area target speed threat levels (speed in kn).
[0170] In different time periods, if the target's course remains the same or the course of each time basically remains parallel, the threat of the target is not great. If the relative routes are inconsistent or not parallel, but change repeatedly and intersect, then the threat level of the target is considered to be high, which may pose a threat to us. The quantitative description is shown in Table 1.3.
[0172] Table 1.3 is a comparison table of sea-area target course threat levels.
[0173] The distance is divided into 5 levels (see Table 1.4), and each distance range corresponds to a different defense stage and a different threat level of the target.
[0175] Table 1.4 is a comparison table of sea-area target distance threat levels.
[0176] According to the types of aerial targets, the threat level of the enemy and unknown targets close to a certain firepower unit or the defended key area is classified, and divided into 9 levels. The specific description is shown in Table 1.5.
[0178] Table 1.5 is a comparison table of airspace target type threat levels.
[0179] The air target speed is divided into 9 levels (see Table 1.6). When the target speed is faster, the time to reach the defense is shorter, and the threat may be greater.
[0182] Table 1.6 is a comparison table of airspace target speed threat levels.
[0183] The target heading angle (0°~180°) is divided into 7 areas, and the threat level is quantified in turn. The specific description is shown in Table 1.7.
[0185] Table 1.7 is a comparison table of airspace target heading threat levels.
[0186] The smaller the distance between the target and the defended object, the greater the threat of the target. The specific description is shown in Table 1.8.
[0189] Table 1.8 is a comparison table of airspace target distance threat levels.
[0190] The higher the altitude of the air target, the smaller the threat. The threat of attack in the airspace is divided into 4 levels, as shown in Table 1.9.
[0192] Table 1.9 is a comparison table of airspace target altitude threat levels.
[0193] The DELPHI method (expert opinion method) is used to solicit and investigate expert opinions on marine and air target indicators, and after statistical processing, the threat estimation weight distribution results are obtained, as shown in Table 2.1.
[0195] Table 2.1 Weight quantification of threat indicators (sea-area and airspace targets)
[0196] For example, suppose there are 5 batches of ship formations and 6 batches of air formations attacking our base. The target attributes are shown in Table 2.2 and 2.3.
[0199] Table 2.2 Indicators of the sea-area threat targets
[0201] Table 2.3 Indicators of the airspace threat targets
[0202] The quantified values and estimated values of the threat degree corresponding to each target attribute are obtained from the threat degree of each attribute, as shown in Tables 2.4 and 2.5. The resulting threat ranking of the 5 batches of sea-area targets is: batch 2 > batch 1 > batch 3 > batch 4 > batch 5; the threat ranking of the 6 batches of airspace targets is: batch 3 > batch 1 > batch 5 > batch 4 > batch 6 > batch 2.
[0204] Table 2.4 Quantification and estimation of the threat weights of sea-area targets
[0206] Table 2.5 Quantification and estimation of the threat weights of airspace targets
[0207] The target defense decision unit 132 is provided with a first threat threshold and a second threat threshold. When the target threat level is lower than the first threat threshold, the matched target defense plan is a prompt alert; when the target threat level is higher than the first threat threshold but lower than the second threat threshold, the matched target defense plan is target interception; and when the target threat level is higher than the second threat threshold, the matched target defense plan is target attack. The target defense plan is executed after the commander issues an execution order.
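The threshold logic of the target defense decision unit can be sketched as follows; the two threshold values are hypothetical and, as stated above, execution still waits for the commander's order.

```python
# Sketch of the target-defense-decision logic: two thresholds split the threat
# levels into alert / intercept / attack plans (threshold values are hypothetical).
FIRST_THREAT_THRESHOLD = 3
SECOND_THREAT_THRESHOLD = 7

def defense_plan(threat_level: int) -> str:
    if threat_level < FIRST_THREAT_THRESHOLD:
        return "prompt alert"
    if threat_level < SECOND_THREAT_THRESHOLD:
        return "target interception"
    return "target attack"

for level in (2, 5, 9):
    print(level, "->", defense_plan(level))
```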
[0208] The interception decision made after the threat assessment of the target mainly includes: checking interception suitability, for example cease-fire or no-intercept restrictions and whether the target is identified as enemy or unknown; judging the target characteristics, that is, whether the target's motion parameters and navigation characteristics meet the requirements of the fire unit; predicting the encounter point of the current launch and determining whether the target is covered; and selecting the most advantageous guided-weapon launcher to intercept the target so as to maximize the overall effectiveness of the defense system.
[0209] During coastal defense, the following processing and response decisions are made according to the enemy's attack on the sea and airspace of our base. When the enemy prepares its fire assault, fortifications are used for strict protection while the sea and air situation is monitored without interruption to ascertain the enemy's main landing direction and location as well as the formation and speed of the landing forces, and preparations are made to resist the landing; long-range artillery fire and aviation fire are used to assault the enemy fire-support ships, to attack enemy aircraft and attack helicopters, to blockade the flight formation corridors, and to cover the obstacles at sea. When the enemy deploys or maneuvers and approaches the shore, fire is concentrated on the enemy's landing ships and fire-support ships. When the enemy reaches the shore and lands on the beach, concentrated fire destroys the enemy's landing craft and armored vehicles and inflicts casualties on the landing troops, while long-range artillery and aviation continue to assault the enemy's follow-on echelons and fire-support fleets. When the enemy has landed, all types of fire support the forward troops in holding the key points, destroying the enemy in mass and preventing the follow-on echelons from approaching and landing. When the enemy breaks into the defensive position, a combination of fire assault, maneuverable obstacles, and holding of key points prevents it from consolidating and expanding the landing area, and the mobile reserve counter-attacks in time, striving to wipe out the enemy before it gains a firm foothold and to regain the key points on the beach. When the enemy lands in the depth of the defense, fire is maneuvered at the right time to assault it, and the anti-airborne reserve encircles and destroys it in cooperation with neighboring forces.
[0210] The following describes the coastal monitoring and defense decision-making method based on multi-dimensional space integration of the present invention.
[0211] As shown in figure 4, the coastal monitoring and defense decision-making method based on multi-dimensional space fusion of the present invention includes the following steps. In step S21, sea situation information in the sea area is acquired and data fusion is performed to obtain the sea-area target information; the sea situation information includes the sea situation center information, VTS information, AIS information, and sea radar information of the sea area, and these are fused to obtain the sea-area target information. In step S22, air situation information in the airspace is acquired and data fusion is performed to obtain the airspace target information; the air situation information includes the ATC information, satellite information, and aircraft information of the airspace, and these are fused to obtain the airspace target information. In step S23, target situation assessment is performed on the sea-area target information and the airspace target information to obtain the target threat probability. In step S24, the target threat level is divided according to the target threat probability, and the corresponding target defense plan is matched to the target threat level.
[0212] The data fusion of the sea situation center information, VTS information, AIS information, and sea radar information in the sea area, and of the ATC information, satellite information, and aircraft information in the airspace, includes: establishing a data fusion model based on a distributed fusion structure for the sea situation information or air situation information to be fused; preprocessing the sea situation information or air situation information according to the data fusion model, including spatial calibration and time calibration, whereby the information is transformed into the same coordinate system by spatial calibration and unified to the same moment by time calibration; using the nearest neighbor method to correlate the tracks in the sea situation information or air situation information to obtain the targets contained therein and the track corresponding to each target; using the weighted average method to fuse the observations of the same target and the tracks corresponding to the same target to obtain the fused track; and associating the fused track with the corresponding target to form the sea-area target information or the airspace target information. The fusion method follows the same principles as the fusion operation unit in the coastal monitoring and defense decision-making system based on multi-dimensional space fusion of the present invention, and is not repeated here.
[0213] The target situation assessment of the sea-area target information and the airspace target information includes a preliminary assessment and a depth assessment. In the preliminary assessment, the target attributes contained in the sea-area target information and airspace target information are evaluated and judged, corresponding assessment values are assigned, and the corresponding target motion trajectory and target attribute information are obtained. The sea-area target information and airspace target information, together with the corresponding assessment values, target motion trajectories, and target attribute information, are displayed; displaying the results of the preliminary assessment provides guidance for the commanders, facilitates judgments based on combat experience, and improves the timeliness of intervention by the managing personnel. Sea-area and airspace target information whose assessment value exceeds the warning value is assessed in depth: a neural network algorithm computes the target threat probability and target-related information from the sea-area target information and airspace target information, and the target threat probability and target-related information are then displayed.
[0214] Classifying the target threat level according to the target threat probability includes: establishing an item comparison table, comprising a target type threat level comparison table, a target speed threat level comparison table, a target heading threat level comparison table, and a target distance threat level comparison table, and assigning corresponding weights to target type, target speed, target heading, and target distance; quantifying the indicators in the target threat probability to obtain the target type, target speed, target heading, and target distance, looking up the corresponding threat values in the item comparison table, and calculating the threat degree value by the weighted sum of the threat values and their weights; and matching the threat degree value with the corresponding threat level to obtain the target threat level. The specific threat level division process is the same as that in the coastal monitoring and defense decision-making system based on multi-dimensional space fusion of the present invention, and is not repeated here.
[0215] Matching the corresponding target defense plan to the target threat level includes: setting a first threat threshold and a second threat threshold; when the target threat level is lower than the first threat threshold, the matched target defense plan is a prompt alert; when the target threat level is higher than the first threat threshold but lower than the second threat threshold, the matched target defense plan is target interception; and when the target threat level is higher than the second threat threshold, the matched target defense plan is target attack. The defense decision is the same as that in the coastal monitoring and defense decision-making system based on multi-dimensional space fusion of the present invention, and is not repeated here.
[0216] The present invention has been described in detail above with reference to the embodiments of the drawings, and those of ordinary skill in the art can make various changes to the present invention based on the above description. Therefore, some details in the embodiments should not constitute a limitation to the present invention, and the present invention will take the scope defined by the appended claims as the protection scope of the present invention.


