High-precision underwater imaging method and device

An imaging method and high-precision technology, applied in image enhancement, image analysis, image data processing, etc.; it addresses the problems of having to learn a priori information and the scarcity of data sets for underwater scenes, and achieves the effects of avoiding dependence on ground-truth images and obtaining high contrast.

Pending Publication Date: 2022-03-18
BEIJING INSTITUTE OF TECHNOLOGY
0 Cites 0 Cited by

AI-Extracted Technical Summary

Problems solved by technology

However, due to the particularity of the underwater environment, there are very few real underwater scene data...

Method used

As an example, the present invention establishes a loss function comprising two parts: the first part is the loss function of the scene radiation sub-network, and the second part is the degradation degree of the underwater image. The Adam gradient descent algorithm is used to optimize the network parameters so that the...

Abstract

The invention discloses a high-precision underwater imaging method and device. The method comprises the steps of: constructing a scene radiation sub-network whose input is a collected underwater distorted image and whose output is a scene radiation map; estimating a transmittance map and a background light map from the underwater distorted image, and synthesizing the scene radiation map, the transmittance map, and the background light map into a simulated underwater distorted image according to an underwater imaging formula; constructing a prior discrimination sub-network based on underwater distorted images, whose input is the scene radiation map and whose output is the underwater image degradation degree; and optimizing the scene radiation sub-network parameters based on the underwater image degradation degree so as to output an optimized high-precision scene radiation map. The method solves the problem that no ground-truth image is available when deep learning is used for the underwater imaging problem, avoids dependence on ground-truth images, and, in combination with an underwater imaging model, recovers a clear, high-contrast, color-corrected underwater image from an underwater distorted image.

Examples

  • Experimental program(1)

Example Embodiment

[0040] It should be noted that the features of the present application and the features in its embodiments can be combined with each other provided that no conflict arises. The present invention will be described in detail below with reference to the accompanying drawings.
[0041] In order to better understand the present invention, the technical solutions in the embodiments of the present invention will be described clearly and completely below. Obviously, the described embodiments are only some of the embodiments of the present invention, not all of them. Based on the embodiments of the present invention, all other embodiments obtained by those of ordinary skill in the art without creative labor fall within the protection scope of the present invention.
[0042] Next, a high precision underwater imaging method and apparatus proposed in accordance with an embodiment of the present invention will be described with reference to the accompanying drawings.
[0043] Figure 1 is a flow chart of a high-precision underwater imaging method according to an embodiment of the present invention.
[0044] As shown in Figure 1, the high-precision underwater imaging method includes the following steps:
[0045] S1: construct a scene radiation sub-network whose input is the collected underwater distorted image and whose output is a scene radiation map.
[0046] S2: estimate a transmittance map and a background light map from the underwater distorted image, and synthesize the scene radiation map, transmittance map, and background light map into a simulated underwater distorted image according to the underwater imaging formula;
[0047] S3: based on underwater distorted images, construct a prior discrimination sub-network whose input is the scene radiation map and whose output is the underwater image degradation degree;
[0048] S4: based on the underwater image degradation degree, optimize the scene radiation sub-network parameters so as to output an optimized high-precision scene radiation map.
[0049] As an example, the scene radiation sub-network of the present invention includes, but is not limited to, a structure that introduces a self-attention mechanism and adopts a residual structure, and outputs a three-channel image of the same size as the input; the self-attention mechanism includes a BN layer, a self-attention layer, and a ReLU layer.
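The patent does not give concrete layer sizes, so the following is only a minimal PyTorch sketch of such a scene radiation sub-network: a BN layer, self-attention computed within each image block, and a ReLU layer wrapped in a residual connection, producing a three-channel output of the same size as the input. The patch size, channel width, and number of blocks are illustrative assumptions, not values from the patent.

```python
import torch
import torch.nn as nn

class PatchSelfAttentionBlock(nn.Module):
    """BN -> self-attention inside each image patch -> ReLU, with a residual skip.
    Patch size, channel width, and head count are illustrative assumptions."""
    def __init__(self, channels=32, patch=8, heads=4):
        super().__init__()
        self.patch = patch
        self.bn = nn.BatchNorm2d(channels)
        self.attn = nn.MultiheadAttention(embed_dim=channels, num_heads=heads,
                                          batch_first=True)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        # H and W are assumed to be multiples of the patch size.
        b, c, h, w = x.shape
        p = self.patch
        y = self.bn(x)
        # Split into non-overlapping p x p patches; attend over the pixels of each patch.
        y = y.unfold(2, p, p).unfold(3, p, p)                  # b, c, h/p, w/p, p, p
        y = y.permute(0, 2, 3, 4, 5, 1).reshape(-1, p * p, c)  # (b * n_patches, p*p, c)
        y, _ = self.attn(y, y, y)
        y = y.reshape(b, h // p, w // p, p, p, c).permute(0, 5, 1, 3, 2, 4)
        y = y.reshape(b, c, h, w)
        return self.relu(x + y)                                # residual connection

class SceneRadiationNet(nn.Module):
    """Maps a distorted underwater image to a scene radiation map of the same size."""
    def __init__(self, width=32, n_blocks=4):
        super().__init__()
        self.head = nn.Conv2d(3, width, 3, padding=1)
        self.blocks = nn.Sequential(*[PatchSelfAttentionBlock(width) for _ in range(n_blocks)])
        self.tail = nn.Conv2d(width, 3, 3, padding=1)

    def forward(self, i_x):
        j_x = self.tail(self.blocks(self.head(i_x)))
        return torch.clamp(i_x + j_x, 0.0, 1.0)  # three channels, same size as the input
```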
[0050] As an example, the present invention utilizes the non-local prior information of the color image and the attenuation characteristics of the red channel of the underwater image to estimate the transmittance map and the background light of the underwater image.
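The patent names a non-local prior and the red-channel attenuation but does not reproduce the estimator itself, so the sketch below is only a simplified stand-in in the spirit of red-channel priors rather than the estimator claimed above: it takes the background light from the most strongly red-attenuated pixels and derives the transmittance from a red-channel dark-channel term. The patch size, the 1% pixel fraction, and the lower bound t_min are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import minimum_filter

def estimate_background_and_transmission(img, patch=15, t_min=0.1):
    """Simplified stand-in estimator (red-channel-prior style, not the patent's
    non-local prior). img: HxWx3 float RGB array in [0, 1].
    Returns the background light A (shape (3,)) and a transmittance map t (HxW)."""
    h, w, _ = img.shape

    # Background light: mean colour of the 1% of pixels whose red channel is most
    # attenuated relative to green/blue (typically the farthest water region).
    redness = img[..., 0] - np.maximum(img[..., 1], img[..., 2])
    idx = np.argsort(redness, axis=None)[: max(1, (h * w) // 100)]
    A = np.clip(img.reshape(-1, 3)[idx].mean(axis=0), 1e-3, 1.0)

    # Red-channel-prior transmittance: red light decays fastest under water, so the
    # inverted red channel plays the role of the dark channel.
    ratios = np.stack([(1.0 - img[..., 0]) / max(1.0 - A[0], 1e-3),
                       img[..., 1] / A[1],
                       img[..., 2] / A[2]], axis=-1)
    dark = minimum_filter(ratios.min(axis=-1), size=patch)
    t = np.clip(1.0 - dark, t_min, 1.0)
    return A, t
```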
[0051] As an example, the prior discrimination sub-network of the present invention includes, but is not limited to, the T2T module and the Deep-Narrow backbone structure of the T2T-ViT network, and a sigmoid function is used in the last layer to normalize the output to the [0, 1] interval; the output is the underwater image degradation degree.
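A minimal sketch of such a prior discrimination sub-network is given below. For simplicity it replaces the T2T module and Deep-Narrow backbone of T2T-ViT with a plain patch embedding followed by a standard transformer encoder; only the sigmoid-normalized scalar output in [0, 1] follows directly from the description above, and the input size, patch size, embedding width, and depth are assumptions.

```python
import torch
import torch.nn as nn

class PriorDiscriminationNet(nn.Module):
    """Scores how degraded an input scene radiation map looks, in [0, 1].
    Generic ViT-style backbone standing in for T2T-ViT (T2T module + Deep-Narrow)."""
    def __init__(self, img_size=224, patch=16, dim=256, depth=6, heads=4):
        super().__init__()
        n_patches = (img_size // patch) ** 2
        self.embed = nn.Conv2d(3, dim, kernel_size=patch, stride=patch)  # patch embedding
        self.pos = nn.Parameter(torch.zeros(1, n_patches, dim))          # learned positions
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads,
                                           dim_feedforward=2 * dim, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)
        self.head = nn.Linear(dim, 1)

    def forward(self, j_x):
        # j_x: Bx3xHxW with H = W = img_size (assumed).
        tokens = self.embed(j_x).flatten(2).transpose(1, 2) + self.pos
        feats = self.encoder(tokens).mean(dim=1)             # average over tokens
        return torch.sigmoid(self.head(feats)).squeeze(-1)   # degradation degree in [0, 1]
```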
[0052] As an example, the present invention establishes a loss function comprising two parts: the first part is the loss function of the scene radiation sub-network, and the second part is the underwater image degradation degree. The Adam gradient descent algorithm is used to optimize the network parameters so that the synthesized simulated underwater distorted image approximates the collected underwater distorted image while the underwater image degradation degree is minimized.
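A possible per-image optimization loop along these lines is sketched below: the reconstruction term E_rec and the degradation degree returned by the prior discrimination sub-network are combined into one objective minimized with Adam. The weighting factor lam, the learning rate, the iteration count, and the helper functions estimate_t_A and synthesize are assumptions for illustration; the prior network is assumed to be pretrained and kept fixed.

```python
import torch

def optimize_scene_radiation(i_x, scene_net, prior_net, estimate_t_A, synthesize,
                             iters=500, lam=0.1, lr=1e-4):
    """i_x: 1x3xHxW collected underwater distorted image in [0, 1].
    estimate_t_A and synthesize are assumed helpers returning torch tensors;
    prior_net is pretrained and kept fixed, only scene_net is updated."""
    t_x, A = estimate_t_A(i_x)                  # transmittance map and background light
    optimizer = torch.optim.Adam(scene_net.parameters(), lr=lr)
    for _ in range(iters):
        j_x = scene_net(i_x)                    # scene radiation map J(x)
        i_sim = synthesize(j_x, t_x, A)         # simulated distorted image I'(x)
        e_rec = torch.mean((i_x - i_sim) ** 2)  # first part: ||I(x) - I'(x)||^2
        e_deg = prior_net(j_x).mean()           # second part: degradation degree
        loss = e_rec + lam * e_deg
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    return scene_net(i_x).detach()              # optimized high-precision radiation map
```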
[0053] The embodiments of the present invention will be further illustrated in conjunction with the accompanying drawings.
[0054] Figure 2 shows a network structure according to an embodiment of the present invention. As shown in Figure 2:
[0055] First, a distorted underwater image I(x) is collected by an underwater camera. The image is input into the network, a self-attention mechanism is introduced, the input image is divided into blocks, and self-attention is computed within each image block. The network includes a BN layer, a self-attention layer, and a ReLU layer, and uses a residual structure to output a scene radiation map J(x) with three channels and the same size as the input. A transmittance map t(x) and a background light map A are estimated with the non-local prior, and the three maps are combined into a simulated distorted image according to the underwater physical imaging formula:
[0056] I'(x) = J(x)·t(x) + A·(1 − t(x))
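Assuming the standard formation model written above, with the transmittance map broadcast over the three color channels, the synthesis step can be implemented directly, for example:

```python
import torch

def synthesize(j_x, t_x, A):
    """j_x: 1x3xHxW scene radiation map, t_x: 1x1xHxW transmittance map,
    A: background light tensor of shape (3,). Returns the simulated image I'(x)."""
    return j_x * t_x + A.view(1, 3, 1, 1) * (1.0 - t_x)
```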
[0057] There are two losses in the network. The first is:
[0058] E_rec = ||I(x) − I'(x)||²
[0059] E_rec is designed to optimize the network parameters so that the output scene radiation map, after being re-degraded through the imaging model, better reproduces the collected underwater distorted image.
[0060] The second loss is defined on the white-balanced underwater image in terms of the brightness V(x) and the saturation S(x) of the image.
[0062] When the brightness of a degraded underwater image increases, its saturation decreases and the difference between the two becomes large; in a clear image the difference is smaller, so this loss function is minimized.
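The exact expression of this second loss is not reproduced in the extracted text; a plausible form consistent with the description, assuming a simple mean over the N pixels of the white-balanced image, would be

E_deg = (1/N) Σ_x ( V(x) − S(x) ),

which stays small for clear images, where brightness and saturation remain close, and grows for degraded ones.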
[0063] The prior discrimination network uses the T2T module and the Deep-Narrow backbone form of the T2T-ViT network, and a sigmoid function in the last layer normalizes the output to the [0, 1] interval. Training does not require underwater image / ground-truth image pairs; the training set only needs to consist partly of underwater images and partly of above-water images. The scene radiation map J(x) is fed into the trained prior network, whose output is the degradation degree of the underwater image. This term is added to the loss function; the loss function is minimized, the scene radiation sub-network is updated, and the optimized scene radiation map J(x), a clear high-precision underwater image, is output.
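Under this reading, the prior discrimination network can be pretrained as a degradation scorer on an unpaired set of underwater images and above-water images. The labeling scheme (1.0 for underwater/degraded, 0.0 for above-water/clean) and the binary cross-entropy objective below are assumptions for illustration, not details taken from the patent.

```python
import torch
import torch.nn as nn

def pretrain_prior_net(prior_net, loader, epochs=10, lr=1e-4):
    """loader yields (image, label) pairs: label 1.0 for underwater (degraded)
    images, 0.0 for above-water (clean) images. Assumed setup, not from the patent."""
    opt = torch.optim.Adam(prior_net.parameters(), lr=lr)
    bce = nn.BCELoss()
    for _ in range(epochs):
        for img, label in loader:
            score = prior_net(img)            # degradation degree in [0, 1]
            loss = bce(score, label.float())
            opt.zero_grad()
            loss.backward()
            opt.step()
    return prior_net
```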
[0064] In summary, the present invention constructs a scene radiation sub-network whose input is a collected underwater distorted image and whose output is a scene radiation map; estimates a transmittance map and a background light map from the collected underwater distorted image; synthesizes the scene radiation map, transmittance map, and background light map into a simulated underwater distorted image; constructs a prior discrimination sub-network whose input is the scene radiation map and whose output is the underwater image degradation degree; optimizes the scene radiation sub-network parameters so that the synthesized simulated underwater distorted image approaches the collected underwater distorted image while the underwater image degradation degree is minimized; and outputs an optimized high-precision scene radiation map.
[0065] The high-precision underwater imaging method of the embodiment of the present invention addresses the fact that no ground-truth image is available for the underwater imaging problem, avoids dependence on ground-truth maps, and, by combining an underwater imaging model with a visual enhancement method, recovers a clear, high-contrast, color-corrected underwater image from the underwater distorted image alone.
[0066] In order to implement the above embodiment, as shown in Figure 3, the high-precision underwater imaging device 10 of this embodiment includes: a first building module 100, a synthesis simulation module 200, a second building module 300, and an image output module 400.
[0067] The first building module 100 is configured to build a scene radiation sub-network whose input is the collected underwater distorted image and whose output is a scene radiation map;
[0068] The synthesis simulation module 200 is configured to estimate a transmittance map and a background light map from the underwater distorted image, and to synthesize the scene radiation map, transmittance map, and background light map into a simulated underwater distorted image according to the underwater imaging formula;
[0069] The second building module 300 is configured to construct, based on underwater distorted images, a prior discrimination sub-network whose input is the scene radiation map and whose output is the underwater image degradation degree;
[0070] The image output module 400 is configured to optimize the scene radiation sub-network parameters based on the underwater image degradation degree, so as to output an optimized high-precision scene radiation map.
[0071] Further, the first building module 100 is also used to introduce a self-attention mechanism and adopt a residual structure, the output being a three-channel image of the same size as the input; the self-attention mechanism includes a BN layer, a self-attention layer, and a ReLU layer.
[0072] According to the high-precision underwater imaging apparatus of the embodiment of the present invention, the first building module is used to construct a scene radiation sub-network whose input is the collected underwater distorted image and whose output is a scene radiation map; the synthesis simulation module is used to estimate a transmittance map and a background light map from the underwater distorted image and to synthesize the scene radiation map, transmittance map, and background light map into a simulated underwater distorted image according to the underwater imaging formula; the second building module is used to construct, based on underwater distorted images, a prior discrimination sub-network whose input is the scene radiation map and whose output is the underwater image degradation degree; and the image output module is used to optimize the scene radiation sub-network parameters based on the underwater image degradation degree, so as to output an optimized high-precision scene radiation map. The present invention addresses the problem that no ground-truth image is available for the underwater imaging problem, avoids dependence on ground-truth maps, and, by combining an underwater imaging model with a visual enhancement method, recovers a clear, high-contrast, color-corrected underwater image from the underwater distorted image.
[0073] It should be noted that the foregoing explanation of the embodiment of the high-precision underwater imaging method also applies to the high-precision underwater imaging device of this embodiment, and details are not described herein again.
[0074] Moreover, the terms "first" and "second" are used only for descriptive purposes and cannot be understood as indicating or implying relative importance or implicitly indicating the number of technical features. Thus, a feature defined with "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "plurality" means at least two, such as two, three, etc., unless otherwise specifically defined.
[0075] In the description of this specification, descriptions referring to the terms "one embodiment", "some embodiments", "example", "specific example", or "some examples", etc., mean that the specific features, structures, materials, or characteristics described in connection with the embodiment or example are included in at least one embodiment or example of the present invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the specific features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. In addition, those skilled in the art can combine different embodiments or examples, and features of different embodiments or examples, described in this specification, provided they do not contradict each other.
[0076] Although the embodiments of the present invention have been shown and described above, it is to be understood that the above-described embodiments are exemplary and cannot be understood as limiting the present invention; those of ordinary skill in the art may change, modify, replace, and vary the above embodiments within the scope of the present invention.

