
A method, device and terminal for realizing interactive image segmentation

An interactive image segmentation technology applied in the field of image processing, which addresses problems such as an unsatisfactory segmentation effect.

Active Publication Date: 2020-03-27
NUBIA TECHNOLOGY CO LTD

AI Technical Summary

Problems solved by technology

[0004] The GrabCut algorithm in the related art performs image segmentation on color images. When the color features of the target object to be extracted are not distinctive, segmentation based on the color image alone does not produce an ideal result.



Examples


Embodiment 1

[0139] As shown in Figure 3, an embodiment of the present invention proposes a method for realizing interactive image segmentation, including:

[0140] S310. Obtain a color map containing the color information of the target object, a depth map containing the depth information of the target object, and a mask map in which foreground points and background points are marked on the color map;

[0141] S320. Determine a first segmentation parameter for each pixel on the mask map according to the color map and the mask map, and determine a second segmentation parameter for each pixel on the mask map according to the depth map and the mask map. The first segmentation parameter and the second segmentation parameter are used to represent the probability that the pixel is determined to be a foreground point or a background point, as well as the numerical difference between the pixel and its adjacent pixels; the first segmentation parameter and the second segmentation parameter ca...
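A minimal sketch of how such first (color-based) and second (depth-based) segmentation parameters could be computed and fused is given below. It assumes scikit-learn Gaussian mixture models for the foreground/background likelihoods, a simplified binary scribble mask, and an illustrative fusion weight alpha; the helper names and the exact fusion rule are not taken from the patent.

```python
# Hedged sketch: per-pixel "segmentation parameters" (data + smoothness terms)
# from a color map and a depth map, fused with an illustrative weight alpha.
# Assumes: color_map HxWx3 uint8, depth_map HxW, mask HxW with
# 1 = user-marked foreground, 0 = user-marked background (simplified encoding).
import numpy as np
from sklearn.mixture import GaussianMixture

def data_terms(features, mask, n_components=5):
    """-log p(pixel | fg GMM) and -log p(pixel | bg GMM) for every pixel."""
    flat = features.reshape(-1, features.shape[-1]).astype(np.float64)
    fg = GaussianMixture(n_components).fit(flat[mask.ravel() == 1])
    bg = GaussianMixture(n_components).fit(flat[mask.ravel() == 0])
    h, w = mask.shape
    return (-fg.score_samples(flat).reshape(h, w),
            -bg.score_samples(flat).reshape(h, w))

def smoothness(features):
    """Contrast term between horizontally / vertically adjacent pixels."""
    f = features.astype(np.float64)
    dx = np.sum((f[:, 1:] - f[:, :-1]) ** 2, axis=-1)   # shape (H, W-1)
    dy = np.sum((f[1:, :] - f[:-1, :]) ** 2, axis=-1)   # shape (H-1, W)
    beta = 1.0 / (2 * max(np.mean(np.concatenate([dx.ravel(), dy.ravel()])), 1e-6))
    return np.exp(-beta * dx), np.exp(-beta * dy)

def fused_parameters(color_map, depth_map, mask, alpha=0.5):
    """First (color) and second (depth) segmentation parameters, fused by alpha."""
    c_fg, c_bg = data_terms(color_map, mask)                # first parameter (color)
    d_fg, d_bg = data_terms(depth_map[..., None], mask)     # second parameter (depth)
    fg_cost = alpha * c_fg + (1 - alpha) * d_fg
    bg_cost = alpha * c_bg + (1 - alpha) * d_bg
    cx, cy = smoothness(color_map)
    dxs, dys = smoothness(depth_map[..., None])
    wx = alpha * cx + (1 - alpha) * dxs                     # fused horizontal weights
    wy = alpha * cy + (1 - alpha) * dys                     # fused vertical weights
    return fg_cost, bg_cost, wx, wy
```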

Embodiment 3

[0232] An embodiment of the present invention also provides a terminal, including the above-mentioned device for realizing interactive image segmentation.

Application example 1

[0234] A method for implementing interactive image segmentation in this example includes the following steps:

[0235] Step S501, acquiring a color map containing the color information of the target object and a depth map containing the depth information of the target object;

[0236] As shown in Figure 5-a, the original image (the color map) contains the target object "stapler", and the user has scribbled on the original image, hoping to segment out the target object "stapler". As shown in Figure 5-b, the depth map is a picture containing depth information and has the same size as the color map; in the depth map, darker regions were captured at a greater distance and lighter regions at a closer distance.
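As an aside, the depth-map convention described here (darker = farther, lighter = closer) can be reproduced from a raw depth array with a short sketch like the one below; it assumes depth values that grow with distance from the camera, which the patent does not state explicitly.

```python
# Hedged sketch: rendering a raw depth array as a grayscale depth map of the
# same size as the color map, with darker = farther and lighter = closer.
import numpy as np

def depth_to_gray(depth: np.ndarray) -> np.ndarray:
    d = depth.astype(np.float64)
    d = (d - d.min()) / max(d.max() - d.min(), 1e-6)   # 0 = nearest, 1 = farthest
    return (255 * (1.0 - d)).astype(np.uint8)          # invert so far pixels are dark
```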

[0237] Step S502, obtaining a mask image initially marked with foreground points and background points for the color image;

[0238] As shown in Figure 5-c, the mask map is a picture initially marked with foreground points and background...
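A possible way to build such an initial mask map from the user's scribbles is sketched below, using the OpenCV GrabCut label constants as a stand-in encoding; the boolean scribble arrays are hypothetical inputs, not part of the patent text.

```python
# Hedged sketch: building the initial mask map of step S502 from user scribbles.
# Uses the OpenCV GrabCut label convention (GC_BGD / GC_FGD / GC_PR_BGD / GC_PR_FGD);
# fg_scribble / bg_scribble are hypothetical HxW boolean arrays of scribbled pixels.
import numpy as np
import cv2

def initial_mask(shape, fg_scribble, bg_scribble):
    mask = np.full(shape, cv2.GC_PR_BGD, dtype=np.uint8)  # undetermined pixels
    mask[fg_scribble] = cv2.GC_FGD                         # user-marked foreground points
    mask[bg_scribble] = cv2.GC_BGD                         # user-marked background points
    return mask
```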



Abstract

The invention discloses a method, a device and a terminal for implementing interactive image segmentation. The method comprises the following steps: acquiring a color map, a depth map, and a mask map in which foreground points and background points are marked on the color map; determining the first segmentation parameters of the respective pixels in the mask map according to the color map and the mask map, determining the second segmentation parameters of the respective pixels in the mask map according to the depth map and the mask map, and fusing the two types of segmentation parameters; constructing an undirected graph, mapping the fused segmentation parameter of each pixel in the mask map into the undirected graph, processing the undirected graph with a min-cut / max-flow algorithm to obtain a finely segmented mask map, and segmenting the image corresponding to the foreground points of the finely segmented mask map out of the color map. The method and device can improve the image segmentation effect by incorporating the depth information of the image.
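The graph-construction and min-cut / max-flow step of the abstract can be illustrated with the sketch below. It assumes the third-party PyMaxflow library, a 4-connected pixel grid, and fused per-pixel terms (fg_cost, bg_cost, wx, wy) produced by a fusion step like the one sketched under Embodiment 1; the capacity convention and the weight lam are illustrative choices, not taken from the patent.

```python
# Hedged sketch: mapping fused per-pixel segmentation parameters into an
# undirected grid graph and solving it with min-cut / max-flow (PyMaxflow).
# fg_cost / bg_cost: HxW fused data terms (lower = more likely that label);
# wx (H, W-1) / wy (H-1, W): fused contrast weights between adjacent pixels.
import numpy as np
import maxflow  # pip install PyMaxflow

def cut_mask(fg_cost, bg_cost, wx, wy, lam=10.0):
    h, w = fg_cost.shape
    g = maxflow.Graph[float]()
    nodes = g.add_grid_nodes((h, w))
    # terminal edges: per-pixel costs of the foreground / background labels
    g.add_grid_tedges(nodes, fg_cost, bg_cost)
    # neighbourhood edges: penalise cutting between similar adjacent pixels
    # (explicit loops for clarity; a vectorised grid construction is also possible)
    for y in range(h):
        for x in range(w - 1):
            g.add_edge(nodes[y, x], nodes[y, x + 1], lam * wx[y, x], lam * wx[y, x])
    for y in range(h - 1):
        for x in range(w):
            g.add_edge(nodes[y, x], nodes[y + 1, x], lam * wy[y, x], lam * wy[y, x])
    g.maxflow()
    # True where the pixel falls on the sink side of the cut; under this capacity
    # convention that corresponds to the foreground.
    return g.get_grid_segments(nodes)
```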

Description

Technical field

[0001] The present invention relates to the technical field of image processing, and in particular to a method, device and terminal for realizing interactive image segmentation.

Background technique

[0002] Image segmentation refers to dividing a plane image into several unconnected regions according to its color, texture, shape and other characteristics; it is a practical basic technology in the field of image processing. Existing image segmentation techniques include threshold-based segmentation methods, edge-based segmentation methods, region-based segmentation methods, energy-functional-based segmentation methods, graph-theory-based segmentation methods, and so on. Among the graph-theory-based methods, the GraphCut algorithm and its improved version, the GrabCut algorithm, are the best known.

[0003] The GraphCut algorithm and its improved version, the GrabCut algorithm, are interactive image segmentation methods based on region annotation. The GraphCut algorith...
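For context, the color-only GrabCut of the related art referred to in paragraphs [0002]-[0004] is available in OpenCV; the sketch below shows that conventional baseline (rectangle-initialized, color image only) rather than the patent's depth-fused method.

```python
# Hedged sketch of the related-art, color-only GrabCut baseline (OpenCV).
# The patent's contribution is to fuse depth information on top of this idea;
# this snippet only shows the conventional color-image pipeline.
import numpy as np
import cv2

def grabcut_color_only(image_bgr, rect):
    """rect = (x, y, w, h) roughly enclosing the target object."""
    mask = np.zeros(image_bgr.shape[:2], dtype=np.uint8)
    bgd_model = np.zeros((1, 65), dtype=np.float64)
    fgd_model = np.zeros((1, 65), dtype=np.float64)
    cv2.grabCut(image_bgr, mask, rect, bgd_model, fgd_model, 5, cv2.GC_INIT_WITH_RECT)
    # pixels labelled definite or probable foreground form the segmentation
    fg = np.where((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD), 1, 0).astype(np.uint8)
    return image_bgr * fg[:, :, None]
```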


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06T7/194; G06T7/90
Inventor: 梁舟 (Liang Zhou)
Owner: NUBIA TECHNOLOGY CO LTD