
Training method of generative model, polyp identification method and device, medium and equipment

A generative-model training method and polyp identification technology, applied in the field of image processing, address the problems of insufficient detection accuracy on out-of-sample data and unreliable polyp detection. The method ensures semantic consistency, improves detection accuracy and robustness, and reduces the manpower and time required.

Active Publication Date: 2021-10-01
BEIJING BYTEDANCE NETWORK TECH CO LTD

AI Technical Summary

Problems solved by technology

However, when the out-of-sample data exhibits a large domain shift, networks trained in this way may show a large performance gap, and it is difficult to ensure model generalization from limited sample data. As a result, the trained model's detection accuracy on out-of-sample data is insufficient for accurate polyp detection.



Description of Embodiments

[0034] Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. Although certain embodiments of the present disclosure are shown in the drawings, it should be understood that the disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that the present disclosure will be understood more thoroughly and completely. It should be understood that the drawings and embodiments of the present disclosure are for exemplary purposes only and are not intended to limit the protection scope of the present disclosure.

[0035] It should be understood that the various steps described in the method embodiments of the present disclosure may be executed in different orders and/or executed in parallel. Additionally, method embodiments may include additional steps and/or omit performing illustrated steps. The scope of the present disclosure is not limited in this respect. ...


Abstract

The invention relates to a training method for a generative model, a polyp identification method and device, a medium, and equipment. The method comprises: obtaining a training sample set, where each training sample comprises a training image and the polyp label category corresponding to that image; obtaining, from the training image and an image generation model, a generated image and a restored image corresponding to the training image; determining a first distribution distance between the training image and the generated image; determining a target loss for the image generation model from the first distribution distance, the training image, the generated image, the restored image, and the polyp label category corresponding to the training image, where the target loss comprises a first distribution loss that is determined from the first distribution distance and is negatively correlated with it; and, when an update condition is met, updating the parameters of the image generation model according to the target loss.
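The loss construction summarized in the abstract can be sketched in a few lines. This is a hedged illustration only, not the patented method: the patent text does not specify the distance measure, the exact form of the negative correlation, or the loss weights, so the channel-statistics distance, the `1/(1+d)` mapping, and the mean-squared reconstruction term below are all assumptions chosen for clarity.

```python
import numpy as np

def distribution_distance(img_a, img_b):
    """Illustrative stand-in for the 'first distribution distance':
    Euclidean distance between per-channel mean/std statistics."""
    stats = lambda x: np.concatenate([x.mean(axis=(0, 1)), x.std(axis=(0, 1))])
    return float(np.linalg.norm(stats(img_a) - stats(img_b)))

def first_distribution_loss(dist):
    """Negatively correlated with the distance, as the abstract requires:
    the larger the distance, the smaller this loss term."""
    return 1.0 / (1.0 + dist)

def target_loss(train_img, gen_img, restored_img, w_dist=1.0, w_rec=1.0):
    """Combine the first distribution loss with a reconstruction term on the
    restored image (a proxy for the semantic-consistency objective)."""
    dist = distribution_distance(train_img, gen_img)
    rec = float(np.mean((train_img - restored_img) ** 2))
    return w_dist * first_distribution_loss(dist) + w_rec * rec
```

Under this mapping, minimizing the target loss pushes the generated image's distribution away from the training image (encouraging diverse, domain-shifted samples) while keeping the restored image close to the original, matching the stated goal of semantic consistency.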

Description

Technical field

[0001] The present disclosure relates to the field of image processing, and in particular to a training method for a generative model, a polyp identification method, a device, a medium, and equipment.

Background

[0002] Endoscopes are widely used for colon screening and polyp detection, but their detection accuracy largely depends on the experience of the endoscopist. Because the characteristics of polyps are difficult to identify and many polyps are small in size, the missed-detection rate of polyp detection is relatively high, which greatly increases the difficulty of early polyp screening.

[0003] In related technologies, deep learning methods can be used to train models for computer-aided diagnosis systems that identify and segment polyps. However, when the out-of-sample data has a large domain shift, these trained networks may show a large performance gap...


Application Information

IPC (8): G06K9/62; G06T7/00
CPC: G06T7/0012; G06T2207/10068; G06T2207/30032; G06F18/214; G06F18/241
Inventor: 边成, 石小周, 杨延展
Owner BEIJING BYTEDANCE NETWORK TECH CO LTD