[0017] To enable those skilled in the art to better understand the technical solutions of the present invention, the present invention is described in detail below with reference to the accompanying drawings and specific embodiments. The embodiments are described for illustration and are not intended to limit the present invention. Where the steps described herein have no contextual dependence on one another, the order in which they are described is merely an example and should not be regarded as a limitation; those skilled in the art will appreciate that the order may be adjusted, as long as the adjustment does not break the logic between the steps or render the overall process inoperable.
[0018] An embodiment of the present invention provides a training method for an underwater sea urchin image recognition model. Referring to Figure 1, which shows a flowchart of a training method for an underwater sea urchin image recognition model according to an embodiment of the present invention, the training method includes:
[0019] Step S100, constructing an underwater sea urchin image recognition model.
[0020] In some embodiments, see Figure 2, which is a schematic diagram of the underwater sea urchin image recognition model. The underwater sea urchin image recognition model is based on the SSD (Single Shot MultiBox Detector) model and is constructed as follows: the feature extraction part of the SSD model uses the convolutional layers of VGG, the two fully connected layers of VGG are converted into ordinary convolutional layers, and multiple convolutional layers are connected to the final detection and classification layer for regression. The SSD model constructed by the present invention is SSD300: the size of the input image is 300×300, the feature extraction part uses the convolutional layers of VGG16, the two fully connected layers of VGG16 are converted into ordinary convolutional layers (conv6 and conv7 in the figure), followed by multiple convolutional layers (conv8_1, conv8_2, conv9_1, conv9_2, conv10_1, conv10_2), and finally a global average pooling layer reduces the output to 1×1 (Pool 11). As can be seen from the figure, SSD connects conv4_3, conv7, conv8_2, conv9_2, conv10_2, and Pool 11 to the final detection and classification layer for regression.
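As a point of reference, the detection layers listed above can be cross-checked against the standard SSD300 configuration. The feature-map sizes and per-location default-box counts in the sketch below are taken from the original SSD design, not stated in this embodiment, and are therefore assumptions; the sketch simply totals the default boxes fed to the final detection and classification layer.

```python
# Standard SSD300 detection-layer configuration (assumed, from the original
# SSD design; this embodiment does not state feature-map sizes or box counts).
layers = [
    ("conv4_3",  38, 4),   # (layer name, feature-map side, default boxes per cell)
    ("conv7",    19, 6),
    ("conv8_2",  10, 6),
    ("conv9_2",   5, 6),
    ("conv10_2",  3, 4),
    ("pool11",    1, 4),
]

total = sum(side * side * boxes for _, side, boxes in layers)
for name, side, boxes in layers:
    print(f"{name:8s}: {side}x{side} map, {boxes} boxes/cell -> {side * side * boxes}")
print("total default boxes:", total)  # 8732 for the standard SSD300 layout
```

Each of the six connected feature maps contributes side × side × boxes default boxes, giving 8732 boxes regressed by the detection and classification layer in the standard configuration.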
[0021] Step S200, acquiring an underwater sea urchin image data set. It should be noted that the underwater sea urchin image data set includes at least one underwater sea urchin image, and may be composed of existing underwater sea urchin images or obtained through field collection. For example, underwater sea urchin images are collected underwater by an acquisition terminal to form the underwater sea urchin image data set. The acquisition terminal may be a dedicated underwater imaging device that collects multiple underwater sea urchin images or underwater sea urchin videos. In the case of an underwater sea urchin video, frames may be extracted one by one to obtain several video frames, each of which is an underwater sea urchin image. The above acquisition method of the underwater sea urchin image data set is merely an example; in actual application, the acquisition method includes but is not limited to the above.
[0022] Step S300, performing multi-scale color restoration (Multi Scale Retinex with Color Restoration, MSRCR) on the underwater sea urchin image data set.
[0023] In some embodiments, see Figure 3, which is a flowchart of multi-scale color restoration for the underwater sea urchin image data set. Specifically, multi-scale color restoration is performed on the underwater sea urchin image data set by the following formula (1):
[0024] Formula (1):

R_i(x, y) = C_i(x, y) \cdot \sum_{k=1}^{K} w_k \left\{ \log[I_i(x, y)] - \log[F_k(x, y) * I_i(x, y)] \right\}

[0025] where R_i(x, y) represents the reflection component of the i-th channel; C_i(x, y) represents the color recovery factor of the i-th channel, whose expression is shown in formula (2); K indicates the number of scales, generally taken as 3; w_k denotes the weighting coefficient of the k-th scale, with \sum_{k=1}^{K} w_k = 1; i represents the i-th channel; F_k(x, y) denotes the Gaussian filter function at the k-th scale, whose expression is shown in formula (3); I_i(x, y) is the i-th channel of the input image, and * denotes convolution.

[0026] Formula (2):

C_i(x, y) = \beta \log\!\left[ \alpha I_i(x, y) \Big/ \sum_{j=1}^{3} I_j(x, y) \right]

[0027] where \beta is the gain constant, generally taken as 46, and \alpha controls the strength of the nonlinearity, generally taken as 125.

[0028] Formula (3):

F_k(x, y) = \lambda_k \exp\!\left[ -\frac{x^2 + y^2}{\sigma_k^2} \right]

[0029] where \sigma_k is the scale parameter of the Gaussian surround function at the k-th scale, and \lambda_k is a normalization constant chosen so that F_k(x, y) integrates to 1.
[0030] Step S400, processing the underwater sea urchin image data set by the dark channel prior method. The dark channel prior method is a conventional method in the field of image processing, which is not described in detail in this embodiment.
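The dark channel prior referred to above is the single-image dehazing scheme commonly attributed to He et al.; since the embodiment leaves it unspecified, the following is only a compact numpy sketch. The patch size, ω = 0.95, and t₀ = 0.1 are the commonly used defaults, not parameters given by this embodiment.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def dark_channel(img, patch=15):
    """Per-pixel min over RGB, then min over a patch x patch neighborhood."""
    mins = img.min(axis=2)
    pad = patch // 2
    padded = np.pad(mins, pad, mode="edge")
    windows = sliding_window_view(padded, (patch, patch))
    return windows.min(axis=(2, 3))

def dehaze(img, patch=15, omega=0.95, t0=0.1):
    """Dark channel prior dehazing sketch; img is H x W x 3 in [0, 1]."""
    dark = dark_channel(img, patch)
    # Atmospheric light A: mean color of the brightest 0.1% dark-channel pixels
    n = max(1, dark.size // 1000)
    idx = np.unravel_index(np.argsort(dark, axis=None)[-n:], dark.shape)
    a = img[idx].mean(axis=0)
    # Transmission estimate, floored at t0 to avoid over-amplification
    t = 1.0 - omega * dark_channel(img / a, patch)
    t = np.clip(t, t0, 1.0)[..., None]
    # Scene radiance recovery
    return np.clip((img - a) / t + a, 0.0, 1.0)
```

Production implementations usually add guided filtering to refine the transmission map; that refinement is omitted here for brevity.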
[0031] Step S500, image fusion is performed on the underwater sea urchin image data set of multi-scale color restoration and the underwater sea urchin image data set processed by the dark channel prior method.
[0032] In some embodiments, referring to Figure 4 and Figure 5, the image fusion of the multi-scale color restored underwater sea urchin image data set and the underwater sea urchin image data set processed by the dark channel prior method includes:
[0033] Step S501, calculating the sharpness of the multi-scale color restored underwater sea urchin image and the underwater sea urchin image processed by the dark channel prior method.
[0034] In some embodiments, the sharpness of the multi-scale color restored underwater sea urchin image and the underwater sea urchin image processed by the dark channel prior method is calculated by the following formula (4):
[0035] Formula (4):

p = \frac{1}{M \times N} \sum_{i=1}^{M} \sum_{j=1}^{N} \left| \frac{\mathrm{d}f}{\mathrm{d}x} \right|

[0036] where M represents the number of rows in the image and N represents the number of columns; p indicates the sharpness of the image; \mathrm{d}f represents the magnitude of the grayscale change of the image, and \mathrm{d}x represents the corresponding change in pixel pitch.
[0037] Step S502: calculating the fusion weight coefficient according to the sharpness.
[0038] In some embodiments, the fusion weight coefficient is calculated by the following formula (5):
[0039] Formula (5):

w_1 = \frac{p_1}{p_1 + p_2}

[0040] where p_1 is the sharpness of the current image, p_2 is the sharpness of the other image, and w_1 is the fusion weight coefficient.
[0041] Step S503, based on the fusion weight coefficient, splitting the multi-scale color restored underwater sea urchin image and the underwater sea urchin image processed by the dark channel prior method into the three RGB channels and fusing them channel by channel to obtain the fused image.
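Steps S501 to S503 can be sketched with numpy as follows, using forward differences for the grayscale change df over a unit pixel pitch dx. The grayscale conversion weights are the usual BT.601 luma coefficients, an assumption not stated in the text.

```python
import numpy as np

def sharpness(img):
    """Formula (4): p = sum |df/dx| / (M * N), df via forward differences
    of the grayscale image over a unit pixel pitch."""
    gray = img @ np.array([0.299, 0.587, 0.114])   # BT.601 luma (assumed)
    dy = np.abs(np.diff(gray, axis=0)).sum()       # vertical grayscale change
    dx = np.abs(np.diff(gray, axis=1)).sum()       # horizontal grayscale change
    m, n = gray.shape
    return (dy + dx) / (m * n)

def fuse(img_msrcr, img_dcp):
    """Formula (5) weight, then per-channel weighted fusion (step S503)."""
    p1, p2 = sharpness(img_msrcr), sharpness(img_dcp)
    w1 = p1 / (p1 + p2)                            # formula (5)
    fused = np.empty_like(img_msrcr, dtype=np.float64)
    for c in range(3):                             # split into R, G, B channels
        fused[..., c] = w1 * img_msrcr[..., c] + (1 - w1) * img_dcp[..., c]
    return fused
```

Because the two weights sum to 1, each fused channel is a convex combination of the corresponding channels of the two source images, so the sharper image dominates the result.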
[0042] Step S600, sharpening the fused image data set to obtain a training image data set.
[0043] In some embodiments, Gaussian filtering is performed on the fused image obtained in step S500, the difference between the pixel values of the current image and the Gaussian filtering result is computed, and the result is mapped to the range 0-255, as shown in the following formula (6):
[0044] Formula (6):

g(x, y) = f(x, y) + \lambda \left[ f(x, y) - f_G(x, y) \right]

[0045] where g(x, y) is the resulting image after sharpening, f(x, y) is the original image, f_G(x, y) is the image after Gaussian blurring, and \lambda is the sharpening adjustment parameter, generally taken as 0.6. Sharpening the image in this way reduces image noise while also smoothing the image to a certain extent.
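Formula (6) is a standard unsharp mask; the following is a short numpy sketch using λ = 0.6 as mentioned above. The Gaussian blur here is a plain separable convolution, and the blur scale σ is an illustrative assumption, since the embodiment does not specify it.

```python
import numpy as np

def sharpen(img, lam=0.6, sigma=2.0):
    """Formula (6): g = f + lam * (f - f_G), then mapped to [0, 255]."""
    radius = max(1, int(3 * sigma))
    x = np.arange(-radius, radius + 1, dtype=np.float64)
    kernel = np.exp(-(x ** 2) / sigma ** 2)
    kernel /= kernel.sum()
    f = img.astype(np.float64)
    blurred = np.empty_like(f)
    for c in range(f.shape[2]):                    # separable Gaussian per channel
        rows = np.apply_along_axis(np.convolve, 1, f[..., c], kernel, mode="same")
        blurred[..., c] = np.apply_along_axis(np.convolve, 0, rows, kernel, mode="same")
    g = f + lam * (f - blurred)                    # formula (6)
    # Map the result back to the 0-255 range
    g = (g - g.min()) / max(g.max() - g.min(), 1e-12) * 255.0
    return g.astype(np.uint8)
```

Since λ < 1, the high-frequency detail is added back at reduced strength, which is why the method amplifies edges without amplifying noise as aggressively as a full-strength unsharp mask would.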
[0046] Step S700, using the training image data set to train the underwater sea urchin image recognition model. Finally, the trained underwater sea urchin image recognition model is obtained.
[0047] Embodiments of the present invention provide a method for recognizing an underwater sea urchin image, the method comprising: recognizing an underwater sea urchin image by using an underwater sea urchin image recognition model trained by the training method according to any embodiment of the present invention.
[0048] In some embodiments, see Figure 6, which is a flowchart of the underwater sea urchin image recognition method. The underwater sea urchin image recognition method includes:
[0049] Step S601, performing multi-scale color restoration on the underwater sea urchin image. This step can be achieved by formula (1) - formula (3) as described above.
[0050] Step S602, the underwater sea urchin image is processed by the dark channel prior method.
[0051] Step S603, image fusion is performed on the multi-scale color restored underwater sea urchin image and the underwater sea urchin image processed by the dark channel prior method. This step can be achieved by formula (4) - formula (5) as described above.
[0052] Step S604, the fused image is sharpened and then input into the trained underwater sea urchin image recognition model for recognition. The sharpening process in this step can be achieved by the above formula (6).
[0053] An embodiment of the present invention provides a training device for an underwater sea urchin image recognition model. The device includes a processor, and the processor is configured to: construct an underwater sea urchin image recognition model; acquire an underwater sea urchin image data set; perform multi-scale color restoration on the underwater sea urchin image data set; process the underwater sea urchin image data set by the dark channel prior method; perform image fusion on the multi-scale color restored underwater sea urchin image data set and the underwater sea urchin image data set processed by the dark channel prior method; sharpen the fused image data set to obtain a training image data set; and train the underwater sea urchin image recognition model by using the training image data set.
[0054] In some embodiments, the processor is further configured to perform multi-scale color restoration on the underwater sea urchin image dataset by the following formula (1):
[0055] Formula (1):

R_i(x, y) = C_i(x, y) \cdot \sum_{k=1}^{K} w_k \left\{ \log[I_i(x, y)] - \log[F_k(x, y) * I_i(x, y)] \right\}

[0056] where R_i(x, y) represents the reflection component of the i-th channel; C_i(x, y) represents the color recovery factor of the i-th channel, whose expression is shown in formula (2); K indicates the number of scales, generally taken as 3; w_k denotes the weighting coefficient of the k-th scale, with \sum_{k=1}^{K} w_k = 1; i represents the i-th channel; F_k(x, y) denotes the Gaussian filter function at the k-th scale, whose expression is shown in formula (3); I_i(x, y) is the i-th channel of the input image, and * denotes convolution.

[0057] Formula (2):

C_i(x, y) = \beta \log\!\left[ \alpha I_i(x, y) \Big/ \sum_{j=1}^{3} I_j(x, y) \right]

[0058] where \beta is the gain constant, generally taken as 46, and \alpha controls the strength of the nonlinearity, generally taken as 125.

[0059] Formula (3):

F_k(x, y) = \lambda_k \exp\!\left[ -\frac{x^2 + y^2}{\sigma_k^2} \right]

[0060] where \sigma_k is the scale parameter of the Gaussian surround function at the k-th scale, and \lambda_k is a normalization constant chosen so that F_k(x, y) integrates to 1.
[0061] In some embodiments, the processor is further configured to: calculate the sharpness of the multi-scale color restored underwater sea urchin image and of the underwater sea urchin image processed by the dark channel prior method; calculate the fusion weight coefficient according to the sharpness; and, based on the fusion weight coefficient, split the multi-scale color restored underwater sea urchin image and the underwater sea urchin image processed by the dark channel prior method into the three RGB channels and fuse them channel by channel to obtain the fused image.
[0062] In some embodiments, the processor is further configured to: calculate the sharpness of the multi-scale color restored underwater sea urchin image and the underwater sea urchin image processed by the dark channel prior method by the following formula (4):
[0063] Formula (4):

p = \frac{1}{M \times N} \sum_{i=1}^{M} \sum_{j=1}^{N} \left| \frac{\mathrm{d}f}{\mathrm{d}x} \right|

[0064] where M represents the number of rows in the image and N represents the number of columns; p indicates the sharpness of the image; \mathrm{d}f represents the magnitude of the grayscale change of the image, and \mathrm{d}x represents the corresponding change in pixel pitch.
[0065] In some embodiments, the processor is further configured to: calculate the fusion weight coefficient by the following formula (5):
[0066] Formula (5):

w_1 = \frac{p_1}{p_1 + p_2}

[0067] where p_1 is the sharpness of the current image, p_2 is the sharpness of the other image, and w_1 is the fusion weight coefficient.
[0068] Embodiments of the present invention provide an apparatus for recognizing images of underwater sea urchins. The apparatus includes a processor, and the processor is configured to: recognize underwater sea urchin images by using an underwater sea urchin image recognition model trained by the training method described in any embodiment of the present invention.
[0069] In some embodiments, the processor is further configured to: perform multi-scale color restoration on the underwater sea urchin image; process the underwater sea urchin image by the dark channel prior method; perform image fusion on the multi-scale color restored underwater sea urchin image and the underwater sea urchin image processed by the dark channel prior method; and sharpen the fused image before inputting it into the trained underwater sea urchin image recognition model for recognition.
[0070] An embodiment of the present invention also provides a non-transitory computer-readable medium storing instructions, and when the instructions are executed by a processor, the training method or the identification method according to any embodiment of the present invention is executed.
[0071] Furthermore, although exemplary embodiments have been described herein, the scope includes any and all embodiments having equivalent elements, modifications, omissions, combinations (e.g., of aspects across various embodiments), adaptations, or alterations based on the present invention. The elements in the claims are to be construed broadly based on the language employed in the claims, and are not limited to the examples described in this specification or during the prosecution of this application, which examples are to be construed as non-exclusive. Therefore, this specification and the examples are intended to be regarded as illustrative only, with the true scope and spirit being indicated by the following claims along with their full scope of equivalents.
[0072] The above description is intended to be illustrative and not restrictive. For example, the above examples (or one or more aspects thereof) may be used in combination with each other, and other embodiments may be devised by those of ordinary skill in the art upon reading the above description. Additionally, in the foregoing detailed description, various features may be grouped together to streamline the present disclosure. This should not be interpreted as an intention that an unclaimed disclosed feature is essential to any claim; rather, inventive subject matter may lie in less than all features of a particular disclosed embodiment. Thus, the following claims are hereby incorporated into the detailed description as examples or embodiments, with each claim standing on its own as a separate embodiment, and it is contemplated that these embodiments may be combined with each other in various combinations or permutations. The scope of the invention should be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.