
Method and device for realizing interactive image segmentation, and terminal

An interactive image segmentation technology applied in the field of image processing, addressing problems such as unsatisfactory segmentation results, long algorithm running time, and degraded user experience.

Active Publication Date: 2017-06-27
Owner: 杭州味捷品牌管理集团有限公司

AI Technical Summary

Problems solved by technology

[0004] When the GrabCut algorithm is used for image segmentation on a mobile phone, the way the user marks the image is usually not strictly constrained, in order to keep the interaction simple. As a result, when the user marks only a few foreground points, the algorithm may need many iterations and run for a long time, which degrades the user experience.
On the other hand, the GrabCut algorithm in the related art performs segmentation on color images only. When the color features of the target object to be extracted are not distinctive, segmentation based on the color image alone gives unsatisfactory results.
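For context, here is a minimal sketch of the related-art, color-only GrabCut workflow criticized above, using OpenCV's cv2.grabCut in mask-initialization mode; the file names, scribble raster, and iteration count are illustrative assumptions, not values from the patent:

```python
import cv2
import numpy as np

# Related-art baseline: color-only GrabCut seeded from sparse user marks.
img = cv2.imread("photo.jpg")                              # hypothetical input image
mask = np.full(img.shape[:2], cv2.GC_PR_BGD, np.uint8)     # everything starts as probable background

# Assume the user's few foreground scribbles were rasterized into a 0/255 image.
scribbles = cv2.imread("scribbles.png", cv2.IMREAD_GRAYSCALE)
mask[scribbles > 0] = cv2.GC_FGD                           # definite foreground where the user marked

bgd_model = np.zeros((1, 65), np.float64)                  # internal GMM buffers required by OpenCV
fgd_model = np.zeros((1, 65), np.float64)

# With sparse marks, more iterations are typically needed to converge,
# which is the running-time problem described in [0004].
cv2.grabCut(img, mask, None, bgd_model, fgd_model, 5, cv2.GC_INIT_WITH_MASK)

fg = np.where((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD), 255, 0).astype(np.uint8)
cv2.imwrite("segmented.png", cv2.bitwise_and(img, img, mask=fg))
```

Note that the color image is the only evidence used here, which is why objects whose color resembles the background segment poorly.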



Examples


Embodiment 3

[0268] An embodiment of the present invention also provides a terminal, where the terminal includes the above-mentioned device for realizing interactive image segmentation.

Application Example 1

[0270] The user smears over the target object of interest on the original image, and the image segmentation method described herein is used to extract the target object. This may include the following steps:

[0271] Step S501, detecting that the user chooses to mark the target object by smearing;

[0272] For example, two marking buttons are provided on the interface, one labeled "smear" and the other "outline". If the user taps the "smear" button, the smear track is preprocessed.

[0273] Here, smearing and outlining are two different ways of marking the target object;

[0274] Generally, smearing marks the inner area of the target object, while outlining marks along the outer contour of the target object;

[0275] Step S502, detecting that the user smears on the original image;

[0276] For example, as shown in Figure 5-a, the user has smeared on the original image and the target object is a stapler, where the original image is a color image;

...
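As a concrete illustration of steps S501–S502 and of the mark-area construction described in the abstract (a geometric figure built around the smearing locus and expanded into a mark area), the following sketch turns a recorded smear track into an initial mask; the stroke radius, expansion size, track coordinates, and image size are my own assumptions, not parameters from the patent:

```python
import cv2
import numpy as np

def mask_from_smear(track_points, image_shape, stroke_radius=12, expand_px=25):
    """Rasterize the user's smear track, expand it into a mark area, and build an
    initial mask: mark-area pixels -> foreground, all other pixels -> probable background."""
    h, w = image_shape[:2]
    mark = np.zeros((h, w), np.uint8)
    pts = np.asarray(track_points, np.int32).reshape(-1, 1, 2)
    cv2.polylines(mark, [pts], isClosed=False, color=255, thickness=stroke_radius * 2)

    # Expand (dilate) the stroke so the mark area covers more of the target object's interior.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (expand_px, expand_px))
    mark = cv2.dilate(mark, kernel)

    mask = np.full((h, w), cv2.GC_PR_BGD, np.uint8)
    mask[mark > 0] = cv2.GC_FGD
    return mask

# Hypothetical smear track recorded from touch events (x, y) while the user paints.
track = [(120, 90), (135, 96), (150, 104), (170, 110), (190, 113)]
init_mask = mask_from_smear(track, (480, 640))
```

The resulting mask can then be fed to a GrabCut-style routine as its initial labeling.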



Abstract

The invention discloses a method and device for realizing interactive image segmentation, and a terminal. The method comprises the steps that: a geometric figure containing the smearing locus on an original image is constructed and expanded to form a mark area; an input mask image for an image segmentation algorithm is generated, pixels in the mark area are taken as foreground points of the mask image, and pixels of the original image outside the mark area are taken as background points of the mask image; a first segmentation parameter of each pixel is determined according to the color image and the mask image, a second segmentation parameter of each pixel is determined according to a depth image and the mask image, and the two kinds of segmentation parameters are integrated; the integrated segmentation parameter of each pixel is mapped onto an undirected graph, the undirected graph is processed with a min-cut/max-flow algorithm to obtain a finely segmented mask image, and the image corresponding to the foreground points of the finely segmented mask image is obtained by segmenting the color image. The method has the advantages that the algorithm running time is shortened and the image segmentation effect is improved by utilizing the depth information of the image.
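The pipeline in the abstract — per-pixel segmentation parameters computed separately from the color image and the depth image, integrated, mapped to an undirected graph, and solved with min-cut/max-flow — might be sketched roughly as below. This is a simplified illustration, assuming histogram-based likelihood terms, a single grayscale channel standing in for the color model, a fixed fusion weight, a constant smoothness term, and the PyMaxflow library; none of these choices are stated in the patent:

```python
import numpy as np
import maxflow  # PyMaxflow

def unary_from_histograms(channel, fg_mask, bins=32):
    """Negative log-likelihoods of each pixel under foreground/background histograms
    learned from the marked mask (a stand-in for the patent's segmentation parameters)."""
    q = np.clip((channel.astype(np.float32) / 256.0 * bins).astype(int), 0, bins - 1)
    eps = 1e-6
    fg_hist = np.bincount(q[fg_mask].ravel(), minlength=bins) + eps
    bg_hist = np.bincount(q[~fg_mask].ravel(), minlength=bins) + eps
    return -np.log(fg_hist / fg_hist.sum())[q], -np.log(bg_hist / bg_hist.sum())[q]

def segment(gray, depth, fg_mask, w_depth=0.5, smooth=2.0):
    # First parameters from the (grayscale-reduced) color image, second from the depth image.
    fg_c, bg_c = unary_from_histograms(gray, fg_mask)
    fg_d, bg_d = unary_from_histograms(depth, fg_mask)
    fg_cost = fg_c + w_depth * fg_d          # integrate the two kinds of parameters
    bg_cost = bg_c + w_depth * bg_d

    g = maxflow.Graph[float]()
    nodes = g.add_grid_nodes(gray.shape)
    g.add_grid_edges(nodes, smooth)          # 4-connected smoothness (pairwise) term
    # Terminal edges: with this orientation, a pixel labeled foreground pays fg_cost and a
    # pixel labeled background pays bg_cost, so min-cut picks the cheaper overall labeling.
    g.add_grid_tedges(nodes, fg_cost, bg_cost)
    g.maxflow()
    return g.get_grid_segments(nodes)        # True = foreground side of the cut
```

Fusing the depth-based term with the color-based term is what lets the cut separate objects whose color alone is not distinctive.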

Description

Technical field

[0001] The present invention relates to the technical field of image processing, and in particular to a method, device and terminal for realizing interactive image segmentation.

Background technique

[0002] Image segmentation refers to dividing a plane image into several unconnected regions according to its color, texture, shape and other characteristics. It is a practical basic technology in the field of image processing. Existing image segmentation techniques include threshold-based segmentation methods, edge-based segmentation methods, region-based segmentation methods, energy-functional-based segmentation methods, graph-theory-based segmentation methods, and so on. Among the graph-theory-based methods, the GraphCut algorithm and its improved version, the GrabCut algorithm, are well known.

[0003] The GraphCut algorithm and its improved version, the GrabCut algorithm, are interactive image segmentation methods based on region annotation. The GraphCut algorith...
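The background paragraph above is cut off in this extract; as standard context for GraphCut-style methods (textbook material, not quoted from the patent), the binary labeling L of the pixel set P is obtained by minimizing an energy that combines a data term derived from the region annotation with a smoothness term over neighboring pixel pairs N, solved exactly by min-cut/max-flow:

```latex
E(L) = \sum_{p \in \mathcal{P}} D_p(L_p)
     + \lambda \sum_{(p,q) \in \mathcal{N}} V_{p,q}(L_p, L_q),
\qquad
V_{p,q}(L_p, L_q) \propto [L_p \neq L_q]\,
\exp\!\left(-\frac{\lVert I_p - I_q \rVert^2}{2\sigma^2}\right)
```

The method of this patent uses the same graph-cut machinery but, as described in the abstract, builds its data term from both the color image and the depth image.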


Application Information

IPC(8): G06T7/11, G06T7/136, G06T7/143, G06T7/194
CPC: G06T2207/10004, G06T2207/20104, G06T2207/20221
Inventor: 梁舟
Owner: 杭州味捷品牌管理集团有限公司