Wireless capsule endoscope video saliency detection method based on attention mechanism

A capsule endoscope detection technology applied in the field of image processing. It addresses the problem that the complex environment of the digestive tract prevents salient regions from being located during manual inspection, and achieves the effect of rapid salient-region positioning.

Active Publication Date: 2019-09-27
HARBIN INST OF TECH
11 Cites · 14 Cited by

Problems solved by technology

[0004] The purpose of the present invention is to solve the problem that manual inspection of WCE video is easily affected by the complex environment of the digestive tract, so that salient regions cannot be quickly located.



Examples


Specific Embodiment 1

[0022] Specific Embodiment 1: As shown in Figures 1 to 4, the wireless capsule endoscope video saliency detection method based on the attention mechanism described in this embodiment comprises the following steps:

[0023] Step 1. Obtain the complete wireless capsule endoscope video image data, and screen the acquired video image data frame by frame to obtain all valid frame images in the video image data;

[0024] Normal frames among the valid frames are labeled category 0, frames containing abnormal regions are labeled category 1, and the total number of image categories is 2;

[0025] Step 2. Convert all valid frame images obtained in Step 1 into HSV (Hue, Saturation, Value) images, and denoise the converted HSV images to obtain denoised images;

[0026] Perform a color space transformation on the denoised images to obtain color-space-transformed images;

[0027] Step 3, select im...

Specific Embodiment 2

[0044] Specific Embodiment 2: This embodiment differs from Specific Embodiment 1 in how the acquired video image data is screened frame by frame to obtain all valid frame images. The specific process is as follows:

[0045] Because valid and invalid frames in the wireless capsule endoscope video image data differ in the amount of information they contain, an image-information measure such as image entropy is used to analyze the entire video image data; a thresholding method can then be applied directly, with an optimal image entropy threshold obtained through experimental verification;

[0046] Frames in the video image data whose information content exceeds the image entropy threshold are screened out as valid frame images, and all valid frame images form the valid frame sequence.
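The entropy-based screening described above can be sketched as follows. This is a minimal illustration assuming 8-bit grayscale frames; the function names and any particular threshold value are illustrative assumptions, since the patent says only that the threshold is tuned experimentally.

```python
import numpy as np

def image_entropy(gray):
    """Shannon entropy (in bits) of an 8-bit grayscale image's intensity histogram."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    p = p[p > 0]                       # drop empty bins so log2 is defined
    return float(-(p * np.log2(p)).sum())

def screen_valid_frames(frames, threshold):
    """Keep only frames whose entropy exceeds the experimentally tuned threshold."""
    return [f for f in frames if image_entropy(f) > threshold]
```

A flat, information-poor frame has entropy near 0 bits, while a frame using all 256 gray levels uniformly reaches the maximum of 8 bits, so a single threshold between the two populations separates valid from invalid frames.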

Specific Embodiment 3

[0047] Specific Embodiment 3: This embodiment differs from Specific Embodiment 1 in the method used to denoise the converted HSV images: mean filtering, Gaussian smoothing, Laplacian filtering, 3D box filtering, or 3D median filtering.
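As one example from the list of filters above, a 2D median filter can be sketched in NumPy. This naive version is only illustrative (the function name and kernel size are assumptions); in practice one would typically call a library routine such as OpenCV's medianBlur or scipy.ndimage's median_filter.

```python
import numpy as np

def median_filter(channel, k=3):
    """Naive k x k median filter with edge padding (removes impulse noise)."""
    pad = k // 2
    padded = np.pad(channel, pad, mode="edge")
    # All k x k neighborhoods, one per output pixel.
    windows = np.lib.stride_tricks.sliding_window_view(padded, (k, k))
    return np.median(windows, axis=(-2, -1))
```

The median is well suited to the salt-and-pepper noise common in endoscope frames: an isolated outlier pixel never survives, because it is at most one of the k² values in each window.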



Abstract

The invention discloses a wireless capsule endoscope video saliency detection method based on an attention mechanism, and belongs to the technical field of image processing. The method solves the problem that manual inspection of WCE video is easily influenced by the complex environment of the alimentary canal, so that salient regions cannot be quickly located. According to the invention, video image data of the alimentary canal are obtained through capsule endoscope imaging. After the video is preprocessed, a CNN classification model and an LSTM segmentation model are trained; the two models complement and optimize each other, so that saliency detection results for images in the WCE video can be obtained rapidly, overcoming the inability of manual inspection to quickly locate salient regions. The method can be applied in the technical field of image processing.

Description

Technical field

[0001] The invention belongs to the technical field of image processing, and in particular relates to a wireless capsule endoscope video saliency detection method.

Background technique

[0002] Because Wireless Capsule Endoscopy (WCE) is non-invasive and convenient to operate, using WCE to examine the digestive tract has become the most common method of small intestine examination. In addition, in recent years, with the development of imaging technology, the image capture speed of WCE has become faster and faster and its capture angle wider and wider, so a large number of video frames can be obtained from each examination of each patient.

[0003] Today, clinical diagnosis from WCE video mainly relies on doctors manually selecting effective frames from the complete WCE video, and then further analyzing and diagnosing the screened frames. Among them, the acquisition of effective frames of WCE video ne...

Claims


Application Information

IPC(8): G06T7/00; G06T5/00; G06K9/46
CPC: G06T7/0012; G06T5/002; G06T2207/10068; G06T2207/20081; G06T2207/20084; G06V10/462; G06V2201/03
Inventor: 王宽全, 李佳欣, 骆功宁, 王立国, 庄丽维
Owner: HARBIN INST OF TECH