
A method for grading lens opacity based on ocular b-ultrasound images

A lens opacity grading method applied in the field of medical image processing. It solves the problem that lens opacity is difficult to grade from ocular B-ultrasound images, and achieves accurate intelligent grading, reliable lens-feature and opacity identification results, and improved accuracy.

Active Publication Date: 2022-08-02
SICHUAN UNIV

AI Technical Summary

Problems solved by technology

[0004] Aiming at the above-mentioned deficiencies in the prior art, the lens opacity grading method based on ocular B-ultrasound images provided by the present invention solves the problem that existing ocular B-ultrasound images are difficult to use for grading lens opacity.



Examples


Embodiment 1

[0052] As shown in Figure 1, a method for grading lens opacity based on ocular B-ultrasound images comprises the following steps:

[0053] S1. Obtain the original ocular B-ultrasound image and preprocess it;

[0054] S2. Input the preprocessed ocular B-ultrasound image into the trained target detection network YOLOv3 to obtain an eyeball image;

[0055] S3. Input the eyeball image into the trained convolutional neural networks DenseNet161, ResNet152, and ResNet101, respectively, to obtain the corresponding lens opacity prediction results;

[0056] S4. Perform a majority vote on the three lens opacity prediction results to obtain the final lens opacity grade.
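A minimal sketch of step S4's ensemble decision, assuming each network outputs an integer opacity grade (the helper name `majority_vote` and the tie-breaking rule are illustrative; the patent text does not specify how a three-way disagreement is resolved):

```python
from collections import Counter

def majority_vote(predictions):
    """Return the grade predicted by the most networks.

    With three voters a tie can only occur when all three disagree;
    we fall back to the first network's prediction in that case
    (an assumption -- the patent does not state a tie rule).
    """
    grade, votes = Counter(predictions).most_common(1)[0]
    if votes == 1:  # three-way disagreement
        return predictions[0]
    return grade

# Example: DenseNet161 -> 2, ResNet152 -> 2, ResNet101 -> 1
print(majority_vote([2, 2, 1]))  # -> 2
```

Any two agreeing networks thus determine the final grade, which is why an odd number of voters (three) is convenient here.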

Embodiment 2

[0058] The method for preprocessing the original ocular B-ultrasound image in step S1 of Embodiment 1 is specifically:

[0059] Convert the original ocular B-ultrasound image from DICOM format to a PNG image of size 720×576.
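The DICOM-to-PNG conversion implies mapping the scanner's raw pixel values to 8-bit intensities before saving. A sketch of that mapping under a simple min-max windowing assumption (the patent does not specify the windowing; a real pipeline would read the file with a DICOM library such as pydicom and save via an image library):

```python
def to_uint8(pixels, lo=None, hi=None):
    """Linearly rescale raw pixel intensities to the 0..255 range.

    `pixels` is a flat list of raw values; `lo`/`hi` default to the
    data's own min/max (a min-max windowing assumption).
    """
    lo = min(pixels) if lo is None else lo
    hi = max(pixels) if hi is None else hi
    span = max(hi - lo, 1)  # avoid division by zero on flat images
    return [min(255, max(0, round((p - lo) * 255 / span))) for p in pixels]

print(to_uint8([0, 512, 1024]))  # -> [0, 128, 255]
```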

Embodiment 3

[0061] The method for training the target detection network YOLOv3 in step S2 of Embodiment 1 is specifically:

[0062] A1. Construct an original ocular B-ultrasound image dataset, and preprocess each original image;

[0063] A2. Divide the preprocessed ocular B-ultrasound image dataset into a target detection dataset and a feature extraction dataset;

[0064] A3. Manually annotate the eyeball position in each ocular B-ultrasound image of the target detection dataset, and normalize the eyeball coordinates of the annotated position;

[0065] A4. Resize the ocular B-ultrasound images annotated with eyeball coordinates to 416×416 and input them into the target detection network YOLOv3 to extract the eyeball image, completing the training of YOLOv3;
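Step A3's coordinate normalization can be sketched as converting a pixel-space bounding box into the center/size fractions that YOLO-style detectors train on (the function and argument names are illustrative; the 720×576 defaults come from the preprocessing in Embodiment 2):

```python
def normalize_box(x_min, y_min, x_max, y_max, img_w=720, img_h=576):
    """Convert corner coordinates in pixels to normalized
    (x_center, y_center, width, height), each in [0, 1],
    so annotations stay valid after the 416x416 resize."""
    x_c = (x_min + x_max) / 2 / img_w
    y_c = (y_min + y_max) / 2 / img_h
    w = (x_max - x_min) / img_w
    h = (y_max - y_min) / img_h
    return x_c, y_c, w, h

# A box covering the central half of a 720x576 image:
print(normalize_box(180, 144, 540, 432))  # -> (0.5, 0.5, 0.5, 0.5)
```

Because the coordinates are fractions of the image size, the same annotation describes the eyeball both in the 720×576 source and in the 416×416 network input.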

[0066] Among them, the target detection network YOLOv3 uses the three extracted feature maps for eyeball detection, with output sizes 13×13×(a+b+c), 26×26×(a+b+c)...
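The 13×13 and 26×26 grid sizes follow from YOLOv3's standard detection strides (32, 16, 8) applied to the 416×416 input; a quick check (the channel term a+b+c is left symbolic, as in the text above):

```python
input_size = 416
strides = (32, 16, 8)  # YOLOv3's three detection scales, coarse to fine
grids = [input_size // s for s in strides]
print(grids)  # -> [13, 26, 52]
```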



Abstract

The invention discloses a lens opacity grading method based on ocular B-ultrasound images. Eyeball detection is first performed on the original ocular B-ultrasound image by a target detection network, addressing the problems that the eyeball occupies only a small part of the original image and that the irrelevant background produces strong echo interference. Multiple feature extraction networks then extract features from the eyeball region; in each network, an attention module is added so that the network focuses on the lens region within the eyeball, i.e., extracts features from the key region, improving the accuracy of lens opacity grading. Finally, a model integration module combines the multiple feature extraction networks, drawing on the strengths of each to obtain more reliable lens features and opacity grading results.

Description

technical field

[0001] The invention belongs to the technical field of medical image processing, and in particular relates to a lens opacity grading method based on ocular B-ultrasound images.

Background technique

[0002] The lens is the main refractive structure of the eyeball and the only refractive medium with the ability to accommodate. Its most important role is focusing at different distances: by changing its diopter, the focus of the eye when viewing near or far objects falls accurately on the retina. The lens is a transparent, elastic, biconvex body. With age, the lens ages, affecting its transparency and elasticity. If part or all of the lens becomes clouded for various reasons, the originally transparent lens turns opaque or milky white, blocking light from entering the eye and making it difficult to see things clearly. Mild opacity of the lens is the early stage of cataract, and if the opacity of the lens develops to a s...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06T7/00; G06V10/25; G06V10/774; G06V10/80; G06V10/82; G06K9/62; G06N3/04
CPC: G06T7/0012; G06T2207/10132; G06T2207/30041; G06V10/25; G06N3/045; G06F18/25; G06F18/259; G06F18/214
Inventor 吕建成张小菲桑永胜郑恒王坚孙亚楠贺喆南
Owner SICHUAN UNIV