
Urban land utilization information analysis method based on deep neural network

A method combining deep neural networks and urban remote sensing, applied in the field of remote sensing image processing. It addresses the limited accuracy of urban land use extraction from a single sensor and aims to improve classification performance and accuracy.

Active Publication Date: 2018-12-07
BEIJING UNIV OF TECH +1
Cites: 5 · Cited by: 16

AI Technical Summary

Problems solved by technology

[0006] In view of the problems in the prior art, the present invention provides a method for analyzing urban land use information based on a deep neural network, which addresses the limited extraction accuracy of urban land use obtained with a single sensor in the prior art.



Examples


Embodiment 1

[0067] As shown in Figure 1A, which is a schematic flow diagram of the method for analyzing urban land use information based on a deep neural network in this embodiment, the method includes the following steps:

[0068] S1. Obtain a set of fully polarimetric SAR data from a microwave remote sensing satellite and a set of multispectral images from an optical remote sensing satellite; preprocess the fully polarimetric SAR data to be processed to obtain preprocessed fully polarimetric SAR data, and preprocess the optical image to obtain the preprocessed optical image.

[0069] In this embodiment, the fully polarimetric SAR data and the optical image are different images of the same urban area.

[0070] It should be noted that the multispectral optical image to be processed (that is, the optical image) is obtained in advance, and the optical image then undergoes preprocessing such as atmospheric correction, geometric correction, radiometric correction...
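As a rough illustration of one such preprocessing step, the sketch below performs a simple per-band radiometric calibration that converts raw digital numbers to at-sensor radiance with gain and offset coefficients. The function name, array shapes, and coefficient values are hypothetical placeholders, not values from the patent.

```python
# Minimal radiometric-calibration sketch, assuming per-band gain/offset values
# are available from the sensor metadata (the numbers below are placeholders).
import numpy as np

def radiometric_calibration(dn: np.ndarray, gain: float, offset: float) -> np.ndarray:
    """Convert raw digital numbers (DN) of one band to at-sensor radiance."""
    return gain * dn.astype(np.float64) + offset

# Hypothetical 4-band image stored as (bands, rows, cols).
image_dn = np.random.randint(0, 1024, size=(4, 256, 256))
gains = [0.010, 0.012, 0.009, 0.015]     # placeholder per-band gains
offsets = [-0.10, -0.08, -0.12, -0.05]   # placeholder per-band offsets

radiance = np.stack([
    radiometric_calibration(image_dn[i], gains[i], offsets[i])
    for i in range(image_dn.shape[0])
])
```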

Embodiment 2

[0106] Building on the first embodiment above, each of its steps is described in detail below.

[0107] (1) Optical feature extraction

[0108] 1. Multi-spectral band information extraction

[0109] The radiation intensity information of each band of the optical image is extracted: band1, band2, ..., bandn.
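A minimal sketch of this step, assuming the preprocessed multispectral image is held as a NumPy array of shape (n_bands, rows, cols); the variable names and the band count are illustrative only.

```python
import numpy as np

multispectral = np.random.rand(6, 512, 512)   # hypothetical 6-band preprocessed image

# Per-band intensity features, keyed band1 ... band6.
bands = {f"band{i + 1}": multispectral[i] for i in range(multispectral.shape[0])}

# Stacked per-pixel feature vectors: one row per pixel, one column per band.
pixel_features = multispectral.reshape(multispectral.shape[0], -1).T
```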

[0110] 2. Gray level co-occurrence matrix (GLCM) information extraction

[0111] A weighted summation of the intensity of each channel of the optical image is carried out to obtain a single-channel radiation intensity image:

[0112] A = c1·band1 + c2·band2 + ... + cn·bandn

[0113] where the coefficients ci are selected by experience;
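The same summation can be written as a one-line array operation; the coefficient values below are placeholders, since the patent only states that they are chosen by experience.

```python
# Sketch of the weighted summation A = c1*band1 + c2*band2 + ... + cn*bandn.
import numpy as np

def weighted_intensity(bands: np.ndarray, coeffs: np.ndarray) -> np.ndarray:
    """bands: (n_bands, rows, cols); coeffs: (n_bands,). Returns the single-channel image A."""
    return np.tensordot(coeffs, bands, axes=1)

bands = np.random.rand(4, 256, 256)        # hypothetical 4-band image
coeffs = np.array([0.3, 0.3, 0.2, 0.2])    # placeholder empirical weights
A = weighted_intensity(bands, coeffs)
```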

[0114] 3. Use the gray level co-occurrence matrix to extract texture information of the optical image: mean (Mean), correlation coefficient (Correlation), variance (Variance), homogeneity (Homogeneity), contrast (Contrast), dissimilarity (Dissimilarity), entropy (Entropy), and so on.
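A hedged sketch of this GLCM step using scikit-image's graycomatrix/graycoprops (a recent version is assumed). The quantization level, pixel distance, and angle are arbitrary choices, and mean, variance, and entropy are computed directly from the co-occurrence matrix rather than through graycoprops.

```python
# GLCM texture statistics for a single image patch (not a sliding-window texture map).
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(patch: np.ndarray, levels: int = 32) -> dict:
    # Quantize the float image to `levels` gray levels for the co-occurrence matrix.
    q = np.digitize(patch, np.linspace(patch.min(), patch.max(), levels)) - 1
    q = np.clip(q, 0, levels - 1).astype(np.uint8)
    glcm = graycomatrix(q, distances=[1], angles=[0], levels=levels,
                        symmetric=True, normed=True)
    p = glcm[:, :, 0, 0]                 # normalized co-occurrence probabilities
    i, _ = np.indices(p.shape)
    mu = float((i * p).sum())            # GLCM mean of the reference pixel level
    return {
        "mean": mu,
        "variance": float(((i - mu) ** 2 * p).sum()),
        "entropy": float(-(p[p > 0] * np.log2(p[p > 0])).sum()),
        "correlation": graycoprops(glcm, "correlation")[0, 0],
        "homogeneity": graycoprops(glcm, "homogeneity")[0, 0],
        "contrast": graycoprops(glcm, "contrast")[0, 0],
        "dissimilarity": graycoprops(glcm, "dissimilarity")[0, 0],
    }

A = np.random.rand(128, 128)             # stand-in for the weighted intensity image
texture = glcm_features(A)
```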

[0115] ...


Abstract

The invention discloses an urban land use information analysis method based on a deep neural network. The method comprises: registering preprocessed fully polarimetric SAR data and an optical image by means of a manual ground control point selection method, and then extracting polarimetric features and optical features respectively; inputting the polarimetric features and the optical features into a pre-trained SAE classifier to obtain a first preliminary classification result corresponding to the optical features and a second preliminary classification result corresponding to the polarimetric features, where the SAE classifier is obtained by training a deep-neural-network-based architecture with labeled samples; and performing decision-level fusion of the first and second preliminary classification results according to D-S evidence theory to obtain urban land use classification information. The method solves the problem of limited urban land use extraction accuracy with a single sensor in the prior art.
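To illustrate the decision-level fusion idea, the sketch below combines two per-pixel class probability vectors (one from the optical branch, one from the polarimetric SAR branch) with Dempster's rule of combination. Treating each class probability as the mass of a singleton hypothesis is a simplifying assumption for illustration, not the patent's exact evidence construction.

```python
import numpy as np

def dempster_combine(m1: np.ndarray, m2: np.ndarray) -> np.ndarray:
    """Combine two mass vectors defined over the same singleton classes."""
    joint = np.outer(m1, m2)
    agreement = np.trace(joint)          # mass where both sources pick the same class
    if agreement <= 0.0:
        raise ValueError("Total conflict; Dempster's rule is undefined.")
    return np.diag(joint) / agreement    # normalize by (1 - conflict)

# Hypothetical SAE outputs for one pixel over 4 land-use classes.
m_optical = np.array([0.55, 0.25, 0.15, 0.05])
m_sar     = np.array([0.40, 0.35, 0.15, 0.10])
fused = dempster_combine(m_optical, m_sar)
print(fused, "-> class", int(np.argmax(fused)))
```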

Description

Technical Field

[0001] The invention belongs to remote sensing image processing technology, and in particular relates to a method for analyzing urban land use information based on a deep neural network.

Background Technique

[0002] Urbanization is one of the concentrated manifestations of intense human activities changing nature, and urban land use change is one of the most important responses to the urbanization process. Land use classification reflects the forms of human land use and transformation, and reflects the inherent characteristics of land use itself. Urban land use change is a comprehensive reflection of the interaction between long-term human activities and the natural environment. It is closely related to human economic and social development, as well as to natural processes such as runoff, evaporation, ecological processes, and the urban heat island effect. Urban land use change brings complex ecological and environmental consequences, affecti...


Application Information

IPC (8): G06K9/62, G06K9/00, G06N3/04, G06Q50/16
CPC: G06N3/04, G06Q50/165, G06V20/176, G06F18/241, G06F18/25, G06F18/214
Inventors: 张渊智, 李煜, 孙光民
Owner: BEIJING UNIV OF TECH