
A Vegetation Classification and Identification Method

A vegetation classification and recognition method, applied in character and pattern recognition, image analysis, image enhancement, etc., which achieves high-precision classification and recognition of surface vegetation coverage and improves its accuracy.

Active Publication Date: 2020-07-14
GUANGDONG POWER GRID CO LTD +1

AI Technical Summary

Problems solved by technology

At present, existing vegetation coverage classification and extraction mostly relies on a single remote sensing data source, such as high-resolution remote sensing images alone, airborne lidar point clouds, or hyperspectral images; in other words, for the classification and identification of surface vegetation coverage, there is no feasible method that fuses hyperspectral images and lidar point cloud data to achieve fine vegetation coverage classification and extraction.


Examples


Embodiment 1

[0032] As shown in Figure 1, a vegetation classification and recognition method based on the fusion of hyperspectral images and lidar point cloud data includes the following steps:

[0033] S1: Data preprocessing, including lidar point cloud LiDAR (Light Detection And Ranging) preprocessing and hyperspectral image HSI (HyperSpectral Image) preprocessing;
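The patent text does not detail the preprocessing operations, so the following is only a minimal illustrative sketch (Python with NumPy/SciPy, assumed tooling): statistical outlier removal for the lidar point cloud and per-band scaling for the hyperspectral cube, two common preprocessing choices.

```python
# Hypothetical sketch of step S1: lidar outlier removal and HSI band scaling.
# The patent does not specify its exact preprocessing; these are common choices.
import numpy as np
from scipy.spatial import cKDTree

def denoise_point_cloud(points, k=8, std_ratio=2.0):
    """Statistical outlier removal: drop points whose mean distance to their
    k nearest neighbours is far above the global average (assumed approach)."""
    tree = cKDTree(points)
    dists, _ = tree.query(points, k=k + 1)      # first neighbour is the point itself
    mean_d = dists[:, 1:].mean(axis=1)
    keep = mean_d < mean_d.mean() + std_ratio * mean_d.std()
    return points[keep]

def normalize_hsi(cube):
    """Scale each spectral band of an HSI cube (rows, cols, bands) to [0, 1]."""
    mins = cube.min(axis=(0, 1), keepdims=True)
    maxs = cube.max(axis=(0, 1), keepdims=True)
    return (cube - mins) / np.maximum(maxs - mins, 1e-12)
```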

[0034] S2: Registration of the hyperspectral image and the lidar point cloud data: by establishing a robust library of feature line/surface registration primitives, precise registration between the heterogeneous lidar point cloud and the hyperspectral image is achieved, and both datasets are uniformly geocoded to a defined spatial reference system;
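The feature line/surface primitive library itself is not reproduced here; as a simplified stand-in for this registration step, the sketch below estimates a rigid transform from matched 3D control points (for example, endpoints of corresponding feature lines) with the Kabsch algorithm and applies it to bring the point cloud into the image's spatial reference frame. This is an assumption-laden illustration, not the patent's exact procedure.

```python
# Simplified stand-in for step S2: estimate a rigid transform from matched
# control points and use it to geocode the lidar points. The patent's
# line/surface primitive library is not modelled here.
import numpy as np

def rigid_transform(src, dst):
    """Return rotation R and translation t mapping src onto dst (both N x 3)
    in a least-squares sense (Kabsch algorithm)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                 # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

def apply_transform(points, R, t):
    """Transform lidar points into the image's spatial reference frame."""
    return points @ R.T + t
```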

[0035] S3: Use the digital surface model DSM (Digital Surface Model) and digital terrain model DTM (Digital Terrain Model) generated from the lidar point cloud to derive a normalized digital surface model nDSM (Normalized Digital Surface Model);
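A minimal sketch of this step, assuming the DSM and DTM have already been interpolated from the point cloud onto a common raster grid: the nDSM is their per-pixel difference, i.e. object height above bare ground.

```python
# Minimal sketch of step S3: nDSM = DSM - DTM on a shared grid.
# Negative differences are treated as noise and clipped to zero (assumption).
import numpy as np

def compute_ndsm(dsm, dtm):
    ndsm = dsm - dtm
    return np.clip(ndsm, 0.0, None)
```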

[0036] S4: Use the hyperspectral image to calculate two spectral vegetation ind...
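The step text is truncated in the source, so the specific indices are not stated here; purely as an illustration, NDVI is one commonly used spectral vegetation index that can be computed from assumed near-infrared and red bands of the hyperspectral cube.

```python
# Illustrative only: the source text is truncated and the indices actually used
# are unknown. NDVI is shown as one common spectral vegetation index.
import numpy as np

def ndvi(cube, nir_band, red_band):
    """cube: (rows, cols, bands); nir_band / red_band: band indices (assumed)."""
    nir = cube[..., nir_band].astype(np.float64)
    red = cube[..., red_band].astype(np.float64)
    return (nir - red) / np.maximum(nir + red, 1e-12)
```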



Abstract

The invention relates to the field of remote sensing data fusion and intelligent application, namely the fine extraction of surface vegetation coverage through the fusion of multi-source sensor remote sensing data. More specifically, the invention relates to a vegetation classification and identification method based on the fusion of hyperspectral images and lidar point clouds. To address the difficulty of classifying and identifying surface vegetation coverage, the invention adopts a joint classification method that combines features of hyperspectral image and lidar point cloud data, and optimizes the selection and combination of the spatial-spectral features that affect classification accuracy, thereby further improving the accuracy of vegetation coverage classification and identification and increasing the fineness of vegetation coverage mapping.
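As a hedged sketch of the joint-classification idea described above, the snippet below stacks per-pixel spectral bands, a vegetation index map, and the lidar-derived nDSM into a single feature vector and trains a supervised classifier. The use of a random forest (scikit-learn) and this particular feature stack are assumptions for illustration; the text shown here does not name the classifier or the feature-selection procedure.

```python
# Hedged sketch of combined HSI + lidar classification: the classifier choice
# and feature stack are assumptions, not the patent's stated method.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def build_features(hsi_cube, ndvi_map, ndsm):
    """Stack spectral bands, a vegetation index and height into per-pixel features."""
    bands = hsi_cube.shape[2]
    return np.concatenate(
        [hsi_cube.reshape(-1, bands),
         ndvi_map.reshape(-1, 1),
         ndsm.reshape(-1, 1)],
        axis=1,
    )

def train_and_predict(feats, labels, train_mask):
    """Train on labelled pixels (train_mask) and predict a class for every pixel."""
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(feats[train_mask], labels[train_mask])
    return clf.predict(feats)
```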

Description

Technical field

[0001] The invention relates to the field of remote sensing data fusion and intelligent application, namely the fine extraction of surface vegetation coverage through the fusion of multi-source sensor remote sensing data. More specifically, the invention relates to a vegetation classification and identification method based on the fusion of hyperspectral images and lidar point clouds.

Background technique

[0002] The spatial structures and distributions of surface objects are combined according to non-uniform subjective and objective laws, and different surface objects occur at different scales; even the spatial scale of a single type of surface object varies with the characteristics of its spatial structure, and the rules governing the spatial composition of surface vegetation coverage are especially complex. Surface vegetation cover has many types and complex structures, with uneven mixtures of grass and forest, and has the characteristics of complex category co...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06K9/00; G06K9/62; G06T7/30
CPC: G06T7/30; G06T2207/10044; G06T2207/10036; G06T2207/20081; G06V20/194; G06V20/188; G06F18/214; G06F18/241
Inventor: 陈景尚周华敏陈剑光刘明邸龙宋作强胡峰杨喆孙仝郑耀华何勇甘燕良宋海龙魏攀李名
Owner: GUANGDONG POWER GRID CO LTD