Turbulence degradation image semantic segmentation method based on boundary perception and adversarial learning

A degraded-image semantic segmentation technology, applied in neural learning methods, character and pattern recognition, instruments, etc. It addresses problems such as texture information that is difficult to express, semantic segmentation that is difficult to perform accurately, and a lack of data sets, and achieves the effects of reducing high-level inconsistency, improving the semantic segmentation result, and refining coarse segmentation results.

Inactive Publication Date: 2020-06-19
BEIHANG UNIV
AI Technical Summary

Problems solved by technology

However, there are few methods for semantic segmentation of turbulence-degraded images. The current research difficulties mainly lie in the following aspects: (1) Turbulence-degraded images are both blurred and distorted; compared with conventional images they have poorer quality, blurred edges, deformed objects, low contrast, texture information that is hard to express, and frequent noise, which makes accurate semantic segmentation difficult. (2) The task poses a double challenge: global information is needed to resolve semantic discrimination, while local information is needed to resolve detail localization. (3) Data sets for semantic segmentation of turbulence-degraded images are still very scarce, and manually acquiring and building such data sets is difficult. (4) Existing deep-learning-based semantic segmentation methods generally target higher-quality images of common scenes and are not suitable for turbulence-degraded images.



Embodiment Construction

[0048] The technical solutions in the embodiments of the present invention will be described clearly and completely below in conjunction with the accompanying drawings of the embodiments. Obviously, the described embodiments are only some of the embodiments of the present invention, not all of them. Based on these embodiments, all other embodiments obtained by a person of ordinary skill in the art without creative effort fall within the protection scope of the present invention.

[0049] As shown in Figure 1, the specific implementation steps of the present invention are as follows:

[0050] Step 1. Combine the turbulence physical imaging model with image processing algorithms (i.e., image interpolation and image convolution methods). Using the atmospheric turbulence physical imaging model, turbulence degradation simulation is performed on the image, and the semantic segmentation dataset of the turbulence-degraded i...
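For illustration only, the sketch below shows one simple way such a degradation simulation could be realized, assuming a pipeline in which a smooth random displacement field warps the image by interpolation (the distortion component) and a Gaussian kernel blurs it by convolution (the blur component); the function name, parameter values, and field smoothness are assumptions and are not taken from the patent.

```python
# Hypothetical sketch of a turbulence degradation simulation: random
# low-frequency distortion realized by interpolation, followed by
# Gaussian blur realized by convolution. Parameters are illustrative only.
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def simulate_turbulence(image, distortion_strength=2.0, blur_sigma=1.5, seed=None):
    """Apply a simple distortion + blur degradation to a 2-D grayscale image."""
    rng = np.random.default_rng(seed)
    h, w = image.shape

    # Smooth random displacement field (distortion component).
    dx = gaussian_filter(rng.standard_normal((h, w)), sigma=8) * distortion_strength
    dy = gaussian_filter(rng.standard_normal((h, w)), sigma=8) * distortion_strength

    # Warp the image by bilinear interpolation at the displaced coordinates.
    yy, xx = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    warped = map_coordinates(image, [yy + dy, xx + dx], order=1, mode="reflect")

    # Blur the warped image with a Gaussian kernel (convolution component).
    return gaussian_filter(warped, sigma=blur_sigma)

# Example: degrade a synthetic 256x256 image.
clean = np.random.rand(256, 256)
degraded = simulate_turbulence(clean, distortion_strength=2.0, blur_sigma=1.5, seed=0)
```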



Abstract

The invention relates to a turbulence degradation image semantic segmentation method based on boundary perception and adversarial learning, and the method comprises the following steps: (1) combining a simulated turbulence-degraded image with a real turbulence-degraded image based on a turbulence imaging physical model, and constructing a turbulence-degraded image semantic segmentation data set; (2) for the data set obtained in step (1), constructing a DeepLabV3+ semantic segmentation model based on boundary perception in combination with the blur and distortion characteristics of turbulence-degraded images; (3) taking the boundary-perception DeepLabV3+ semantic segmentation model of step (2) as a generator, and constructing a boundary-perception generative adversarial network (GAN) model based on adversarial learning in combination with a discriminator composed of five convolutional layers; and (4) for the GAN model obtained in step (3), performing model training on the turbulence-degraded image semantic segmentation data set obtained in step (1) to obtain a trained semantic segmentation GAN model, and performing semantic segmentation on the turbulence-degraded image by using the trained semantic segmentation GAN model to obtain a prediction segmentation graph.
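As an illustration of the discriminator described in step (3), the following is a minimal PyTorch sketch of a five-convolutional-layer discriminator that scores a segmentation probability map for adversarial learning; the channel widths, kernel sizes, strides, and activations are assumptions, not details given in the abstract.

```python
# Hypothetical five-conv-layer discriminator for segmentation GAN training.
# It maps a predicted softmax map (or a one-hot ground-truth map) to a
# patch-wise real/fake score map. All hyperparameters are assumptions.
import torch
import torch.nn as nn

class SegDiscriminator(nn.Module):
    def __init__(self, num_classes, base_channels=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(num_classes, base_channels, 4, stride=2, padding=1),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(base_channels, base_channels * 2, 4, stride=2, padding=1),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(base_channels * 2, base_channels * 4, 4, stride=2, padding=1),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(base_channels * 4, base_channels * 8, 4, stride=2, padding=1),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(base_channels * 8, 1, 4, stride=2, padding=1),  # score map
        )

    def forward(self, seg_map):
        return self.net(seg_map)

# Example: score a softmax segmentation map for two 256x256 images, 21 classes.
probs = torch.softmax(torch.randn(2, 21, 256, 256), dim=1)
scores = SegDiscriminator(num_classes=21)(probs)  # shape: (2, 1, 8, 8)
```

In adversarial training of this kind, the generator's predicted probability map and the one-hot ground-truth map are both fed to such a discriminator, which pushes the generator toward label-like, spatially consistent predictions.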

Description

Technical field [0001] The present invention relates to a method for semantic segmentation of turbulence-degraded images based on boundary perception and adversarial learning. It is a deep model that combines a boundary perception algorithm with a generative adversarial network (GAN), and is suitable for the task of semantic segmentation of images degraded by atmospheric turbulence. Background technique [0002] Semantic segmentation is widely used in industries such as intelligent driving, security monitoring, and industrial inspection, and is a very challenging task in the field of computer vision. Semantic segmentation of turbulence-degraded images refers to the task of pixel-level classification of degraded images affected by atmospheric turbulence. Atmospheric turbulence severely affects the imaging performance of an optical system, causing degradation phenomena such as distortion and blurring of the observed image, thereby reducing the accuracy of the...
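The technical field describes boundary perception combined with a GAN. One common way to realize boundary perception in a segmentation loss is to upweight pixels near ground-truth class boundaries, as in the minimal sketch below; the patent does not specify this exact formulation, and the boundary extraction and weighting scheme are assumptions for illustration.

```python
# Hypothetical boundary-weighted cross-entropy: pixels adjacent to a
# ground-truth class boundary receive a larger loss weight.
import torch
import torch.nn.functional as F

def boundary_weighted_ce(logits, labels, boundary_weight=5.0):
    """logits: (N, C, H, W) raw outputs; labels: (N, H, W) class indices."""
    # Mark a pixel as boundary if any 3x3 neighbour carries a different label.
    lab = labels.float().unsqueeze(1)                        # (N, 1, H, W)
    local_max = F.max_pool2d(lab, kernel_size=3, stride=1, padding=1)
    local_min = -F.max_pool2d(-lab, kernel_size=3, stride=1, padding=1)
    boundary = (local_max != local_min).float().squeeze(1)   # (N, H, W)

    per_pixel = F.cross_entropy(logits, labels, reduction="none")  # (N, H, W)
    weights = 1.0 + boundary_weight * boundary
    return (weights * per_pixel).sum() / weights.sum()

# Example with random data: 21 classes, two 64x64 label maps.
logits = torch.randn(2, 21, 64, 64)
labels = torch.randint(0, 21, (2, 64, 64))
loss = boundary_weighted_ce(logits, labels)
```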


Application Information

Patent Timeline: no application
Patent Type & Authority Applications(China)
IPC IPC(8): G06K9/00G06K9/46G06N3/04G06N3/08
CPCG06N3/08G06V20/13G06V10/44G06N3/045
Inventor 崔林艳张妍
Owner BEIHANG UNIV