
Fish feeding state detection method

A fish state detection technology, applied in the directions of neural learning methods, image analysis, and image enhancement, achieving the effect of simple and effective quantitative parameters.

Pending Publication Date: 2022-06-10
YULIN NORMAL UNIVERSITY
Cites: 0 | Cited by: 1
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

Extracting the motion features of fish between frames with the optical flow method requires post-processing of video. The feature extraction process is cumbersome, neither real-time nor continuous, and cannot monitor the feeding state of fish over long periods. At the same time, video acquisition is easily affected by lighting and environmental factors, which is also unfavorable for engineering applications.
[0004] In addition, most fish feeding state classification algorithms assume ideal conditions, such as recirculating aquaculture systems and good lighting. As a result, such algorithms are often restricted by conditions such as construction cost, lighting environment, and water clarity; they are also difficult to deploy and cannot be applied to complex factory farming environments.

Method used


Image

  • Fish feeding state detection method

Examples


Embodiment 1


[0048] Referring to Figures 1 to 4, a fish feeding state detection method comprises:

[0049] Collecting depth images of one complete feeding process of a fish school with a depth camera, as shown in Figure 1;

[0050] Converting the depth images into depth pseudo-color maps according to the depth change, as shown in Figure 3;
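As a rough sketch of this conversion step (not the patent's actual program), a depth image can be normalized and mapped through a colormap; the blue-to-red ramp and the 500–580 mm band used below are assumptions taken from the later embodiment, not a quoted implementation:

```python
import numpy as np

def depth_to_pseudocolor(depth, z_min=500.0, z_max=580.0):
    """Map a depth image (mm) to an RGB pseudo-color image.

    Depths are clipped to [z_min, z_max] and mapped through a simple
    blue-to-red ramp; the exact colormap in the patent is not
    specified, so this ramp is illustrative only.
    """
    t = np.clip((depth - z_min) / (z_max - z_min), 0.0, 1.0)
    rgb = np.empty(depth.shape + (3,), dtype=np.uint8)
    rgb[..., 0] = (255 * t).astype(np.uint8)                        # red grows with depth
    rgb[..., 1] = (255 * (1 - np.abs(2 * t - 1))).astype(np.uint8)  # green peaks mid-range
    rgb[..., 2] = (255 * (1 - t)).astype(np.uint8)                  # blue fades with depth
    return rgb

# Synthetic 576x640 depth map, matching the Azure Kinect DK resolution
depth = np.linspace(480, 600, 640 * 576).reshape(576, 640)
pseudo = depth_to_pseudocolor(depth)
print(pseudo.shape)  # (576, 640, 3)
```

In practice a library colormap (e.g. OpenCV's `applyColorMap`) would typically replace the hand-rolled ramp.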

[0051] Labeling the depth pseudo-color maps as strong feeding, moderate feeding, weak feeding, or no feeding;

[0052] Constructing a simple convolutional neural network model, as shown in Figure 4, and training it with the labeled depth pseudo-color maps to obtain a first detection model;

[0053] Using the first detection model to detect the feeding state of fish in practice.
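A minimal PyTorch sketch of what a "simple convolutional neural network" for this four-class task could look like; the patent's actual layer layout (Figure 4) is not reproduced here, so all layer sizes are assumptions:

```python
import torch
import torch.nn as nn

class SimpleFeedingCNN(nn.Module):
    """Illustrative 4-class CNN for 576x640 pseudo-color inputs
    (strong / moderate / weak / no feeding). Layer sizes are
    assumed, not taken from the patent's Figure 4."""

    def __init__(self, num_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),   # global pooling keeps the head small
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x):
        x = self.features(x).flatten(1)
        return self.classifier(x)

model = SimpleFeedingCNN()
logits = model(torch.randn(2, 3, 576, 640))  # batch of 2 pseudo-color maps
print(logits.shape)  # torch.Size([2, 4])
```

Training would follow the usual supervised recipe (cross-entropy loss over the four labels); the lightweight design reflects the document's stated goal of avoiding high-end GPU hardware.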

[0054] In this embodiment, the depth camera is an Azure Kinect DK. As shown in Figure 1, the depth camera 1 continuously emits modulated infrared light pulses into the rearing pond through the infrared...

Embodiment 2


[0067] Further, on the basis of Embodiment 1, Embodiment 2 also includes:

[0068] Calculating the difference between the target-pixel sums of two adjacent frames of depth images, and using this difference as a quantitative index E(k) characterizing the feeding intensity of the fish school:

[0069] E(k) = |f(k+1) − f(k)|

[0070] f(k) = Σ_{x=0}^{M} Σ_{y=0}^{N} g(x, y)

[0071] g(x, y) = 1 if Z0 ≤ Z(x, y) ≤ Z1, otherwise 0    (6)

[0072] Here, f(k) denotes the sum of the target pixel points of the k-th depth image, and Z(x, y) denotes the depth value (in mm) at coordinates (x, y) in the depth map, that is, the vertical distance from the target to the depth camera. The depth image has M*N pixels; since the depth image is 640*576 pixels, x and y range over 0~640 and 0~576 respectively, that is, M=640 and N=576, and Z0 and Z1 are 500 and 580 respectively. Formula (6) indicates that when the depth value of a pixel lies in the range 500-580 mm (this region is the fish feeding area), the value of that pixel is set ...
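A small numpy sketch of this quantitative index, assuming the 500–580 mm feeding band described above and an absolute difference between adjacent frames (the exact form of the patent's formulas is not fully legible in this copy, so both choices are assumptions):

```python
import numpy as np

Z0, Z1 = 500, 580  # feeding-area depth band in mm, per the embodiment

def target_pixel_sum(depth):
    """f(k): count of pixels whose depth lies in [Z0, Z1]."""
    return int(np.count_nonzero((depth >= Z0) & (depth <= Z1)))

def feeding_intensity(depth_k, depth_k1):
    """E(k): difference of target-pixel sums of two adjacent frames,
    taken as an absolute value here (an assumption)."""
    return abs(target_pixel_sum(depth_k1) - target_pixel_sum(depth_k))

# Two synthetic 576x640 depth frames in mm
rng = np.random.default_rng(0)
frame_a = rng.uniform(400, 700, size=(576, 640))
frame_b = rng.uniform(400, 700, size=(576, 640))
print(feeding_intensity(frame_a, frame_b))
```

Because f(k) only counts pixels inside the feeding band, E(k) rises when fish move in and out of that band between frames, which is what makes it usable as a feeding-intensity sequence.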


PUM

No PUM

Abstract

The invention discloses a fish feeding state detection method, relates to the technical field of fish farming, and solves the technical problem that existing fish-school feeding state classification methods cannot be applied to complex industrial farming environments. The method comprises the steps of: collecting depth images of one complete feeding process of a fish school with a depth camera; converting the depth images into depth pseudo-color maps according to the depth change; labeling the depth pseudo-color maps as strong feeding, moderate feeding, weak feeding, or no feeding; constructing a simple convolutional neural network model and training it with the labeled depth pseudo-color maps to obtain a first detection model; and using the first detection model to detect the fish feeding state in practice. By combining a depth camera with an image processing program, the method obtains the fish-school feeding state depth maps and feeding state sequence data in real time, so that a relatively simple convolutional network, or alternatively a simple recurrent neural network, can be used for classification.

Description

technical field

[0001] The invention relates to the technical field of fish farming, and more specifically to a method for detecting the feeding state of fish.

Background technique

[0003] At present, research on fish feeding behavior that combines feature vectors reflecting the feeding state with neural network classification models has become a hotspot. For example, classifying near-infrared images of fish feeding with the LeNet-5 convolutional neural network (CNN) reaches an average accuracy of 90%. Using a dual-stream recurrent network (DSRN) with VGGNet and ResNet as backbone networks, two-class classification of 20-frame fish feeding sequences was realized with an accuracy of 81.4%. However, because the DSRN network structure is complex, with many parameters and long computation time, a computer equipped with a high-performance graphics card is required, and the cost is high, which is ...

Claims


Application Information

Patent Timeline
No application timeline data
Patent Type & Authority: Application (China)
IPC(8): G06T7/00; G06T7/90; G06T7/73; G06T7/50; G06K9/62; G06V10/764; G06V10/82; G06N3/04; G06N3/08
CPC: G06T7/0012; G06T7/90; G06T7/75; G06T7/50; G06N3/08; G06T2207/10028; G06N3/047; G06N3/045; G06F18/24317; G06F18/2415
Inventor 黄平郑金存闭吕庆黄添林廖益杰
Owner YULIN NORMAL UNIVERSITY