
Interactive image segmentation method and interactive image segmentation system based on focusing on mistakenly segmented regions

A technology relating to image segmentation and mis-segmented regions, applied in the fields of computer vision and image processing, which addresses problems such as poor performance and inaccuracy in existing methods.

Pending Publication Date: 2021-02-26
SHANGHAI JIAO TONG UNIV

AI Technical Summary

Problems solved by technology

[0006] 1. This method determines whether pixels belong to the same class based on the similarity of their gray values; in practice, however, there are often significant gray-value differences between mis-segmented and correctly segmented regions that actually belong to the same class, i.e., a large intra-class distance, so judging class membership from gray values alone is inaccurate;
[0007] 2. This method is based on a traditional image segmentation algorithm, and its performance is inferior to that of deep-learning segmentation methods.
[0009] 1. This method decides which class a pixel belongs to from the statistical distribution of gray values in the mis-segmented region; judging class membership from the gray-value distribution alone is inaccurate;
[0010] 2. This method is based on a traditional image segmentation algorithm, and its performance is inferior to that of deep-learning segmentation methods.
[0011] Therefore, image segmentation methods in the prior art still suffer from problems such as poor precision. No explanation or report of technology similar to the present invention has been found, nor has similar data been collected at home or abroad.
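
As a hypothetical numeric illustration (not from the patent), the sketch below restates the point of paragraphs [0006] and [0009]: the gray values inside a correctly segmented region can span a wide range and overlap those of a mis-segmented region, so a similarity threshold on gray value alone cannot reliably separate the two.

```python
# Hypothetical illustration only; the pixel values are invented for this sketch.
import numpy as np

# Invented gray values from a correctly segmented foreground region and from a
# nearby mis-segmented region.
correct_fg = np.array([120, 135, 150, 180, 200], dtype=float)
mis_seg = np.array([130, 145, 195, 215, 230], dtype=float)

# The foreground's own gray values span 80 levels (large intra-class distance),
# and 3 of the 5 mis-segmented values fall inside that range, so thresholding
# on gray-value similarity mixes the two groups.
intra_class_spread = correct_fg.max() - correct_fg.min()
overlapping = np.sum((mis_seg >= correct_fg.min()) & (mis_seg <= correct_fg.max()))
print(intra_class_spread, overlapping)  # 80.0 3
```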



Examples


Embodiment Construction

[0071] The present invention will be described in detail below in conjunction with specific embodiments. The following examples will help those skilled in the art to further understand the present invention, but they do not limit the present invention in any form. It should be noted that those skilled in the art may make various modifications and improvements without departing from the concept of the present invention, and these all fall within the protection scope of the present invention.

[0072] Figure 1 is a flowchart of the interactive image segmentation method based on focusing on mis-segmented regions in an embodiment of the present invention.

[0073] As shown in Figure 1, the interactive image segmentation method based on focusing on mis-segmented regions provided by this embodiment may include the following steps:

[0074] S100a. Initially segment the input image to obtain an initial segmented image with poor accuracy, and perform a foreground and background matti...
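
Step S100a mentions deriving foreground and background matting images from the initial segmentation. The following is a minimal sketch, under the assumption that the matting simply masks the input image with the initial mask and its complement; the function `foreground_background_matting` and the placeholder `initial_seg_net` are hypothetical names, not the patent's implementation.

```python
# Minimal sketch, assuming "matting" means masking the image with the initial
# segmentation mask and its complement. Names here are hypothetical.
import numpy as np

def foreground_background_matting(image: np.ndarray, init_mask: np.ndarray):
    """image: HxWx3 float array; init_mask: HxW binary mask from the initial
    (possibly inaccurate) segmentation."""
    mask = init_mask.astype(image.dtype)[..., None]
    fg_matting = image * mask          # keeps pixels labeled as foreground
    bg_matting = image * (1.0 - mask)  # keeps pixels labeled as background
    return fg_matting, bg_matting

# Hypothetical usage with any pretrained initial segmentation model:
# init_mask = (initial_seg_net(image) > 0.5)
# fg_img, bg_img = foreground_background_matting(image, init_mask)
```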



Abstract

The invention provides an interactive image segmentation method and system based on focusing on mis-segmented regions, comprising the following steps: performing foreground and background matting on the initial segmentation of an input image to obtain foreground and background matting images; generating under-segmentation and over-segmentation geodesic distance guide maps from the input image and the under-segmentation and over-segmentation indication points; extracting full-image features from the input image, the initial segmentation image, and the under-segmentation and over-segmentation geodesic distance guide maps; extracting under-segmentation and over-segmentation region features from the background and foreground matting images and the under-segmentation and over-segmentation indication points; and fusing the under-segmentation and over-segmentation region features with the full-image features to obtain a corrected segmentation image. The method combines prior knowledge with the learning ability of a neural network, improving the accuracy and interpretability of image segmentation. As a means of obtaining segmentation annotations, it allows annotation to be completed with only a few click interactions, avoiding pixel-by-pixel annotation.
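
To make the "geodesic distance guide map" mentioned in the abstract concrete, here is a minimal sketch, under assumptions, of how such a map could be computed from user indication points: a Dijkstra-style traversal whose edge cost combines the spatial step with the local intensity difference, so distances respect image boundaries. The function name `geodesic_distance_map` and the weighting parameter `lam` are illustrative, not taken from the patent.

```python
# Illustrative geodesic distance map from click points; not the patent's code.
import heapq
import numpy as np

def geodesic_distance_map(gray: np.ndarray, seeds, lam: float = 1.0):
    """gray: HxW grayscale image in [0, 1]; seeds: list of (row, col) clicks."""
    h, w = gray.shape
    dist = np.full((h, w), np.inf)
    heap = []
    for r, c in seeds:
        dist[r, c] = 0.0
        heapq.heappush(heap, (0.0, r, c))
    while heap:
        d, r, c = heapq.heappop(heap)
        if d > dist[r, c]:
            continue  # stale queue entry
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w:
                # Edge cost: unit spatial step plus weighted intensity change.
                nd = d + 1.0 + lam * abs(gray[nr, nc] - gray[r, c])
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    heapq.heappush(heap, (nd, nr, nc))
    return dist

# Hypothetical usage: one guide map per interaction type.
# under_guide = geodesic_distance_map(gray_image, under_segmentation_clicks)
# over_guide = geodesic_distance_map(gray_image, over_segmentation_clicks)
```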

Description

Technical field

[0001] The present invention relates to methods in the fields of computer vision and image processing, and in particular to an interactive image segmentation method and system based on focusing on mis-segmented regions.

Background technique

[0002] In recent years, with the rapid development of computer vision technology, the semantic segmentation task, as an important branch of vision tasks, has been widely studied. Image segmentation is an important part of computer-aided diagnosis and surgical operation planning. Manual segmentation by experts guarantees labeling quality, but it is tedious and time-consuming. Automatic segmentation methods such as U-shaped networks have achieved remarkable progress, but are still not accurate or robust enough for clinical applications. Interactive segmentation exploits the user's knowledge in an interactive way to overcome the challenges faced by manual and automatic methods, and achieve ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/20, G06K9/34, G06N3/04
CPC: G06V10/235, G06V10/267, G06N3/045
Inventors: 张小云, 胡伟峰, 姚小芬, 郑州, 钟玉敏, 王晓霞, 张娅, 王延峰
Owner SHANGHAI JIAO TONG UNIV