
Forest unstructured scene segmentation method based on multispectral image fusion

A multispectral-image method for unstructured scenes, applied to graphic image conversion, image data processing, neural learning methods, and related fields. It addresses the difficulty of applying RGB-only images to complex unstructured scenes, mitigating poor network adaptability and mis-segmentation, optimizing the feature extraction process, and improving segmentation accuracy and robustness.

Pending Publication Date: 2022-07-29
SOUTHEAST UNIV


Problems solved by technology

[0005] To solve the above problems, the present invention discloses a forest unstructured scene segmentation method based on multispectral image fusion, which effectively addresses the difficulty that most current RGB-image-based segmentation methods have in complex unstructured scenes. By using different sensing modalities to construct feature descriptions of the same scene, the method further improves the accuracy and robustness of forest unstructured scene segmentation.




Embodiment Construction

[0063] The technical solutions provided by the present invention will be described in detail below with reference to specific embodiments. It should be understood that the following specific embodiments are only used to illustrate the present invention and not to limit its scope.

[0064] The invention discloses a forest unstructured scene segmentation method based on multispectral image fusion. The method designs a parallel encoder and a two-level fusion strategy, realizing layered feature fusion during encoding and decoding. First, in the encoder stage, two encoding branches process RGB data and EVI data respectively, and the two sets of features are fused at each layer to obtain complementary, information-rich features. The encoder also uses dilated convolutions to enlarge the network's receptive field and optimize the feature extraction process. The high-semantic features are then decoded. During decoding, the low-semantic fusion features produced in the encoding stage undergo a secondary fusion with the high-semantic decoded features, yielding the final multispectral fusion segmentation result.
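To make the encoder design concrete, the following is a minimal NumPy sketch under stated assumptions: the patent says the two branch features are fused at every layer but does not specify the fusion operator, so element-wise addition is used here as one simple illustrative choice, and the function names (`fuse`, `effective_kernel`) are hypothetical, not from the patent. The sketch also shows why dilated convolutions enlarge the receptive field: a k-tap kernel with dilation rate d spans d·(k−1)+1 input pixels without adding parameters.

```python
import numpy as np

def fuse(rgb_feat, evi_feat):
    """Per-layer fusion of the RGB and EVI encoder branches.

    Element-wise addition is an assumed, illustrative choice; the
    patent only states that branch features are fused at each layer.
    """
    return rgb_feat + evi_feat

def effective_kernel(k, dilation):
    """Span (in input pixels) covered by a k-tap kernel at the given
    dilation rate: inserting (dilation - 1) gaps between taps widens
    the kernel from k to dilation * (k - 1) + 1."""
    return dilation * (k - 1) + 1

# A 3x3 kernel with dilation 2 sees a 5x5 window; with dilation 4,
# a 9x9 window -- a larger receptive field at the same parameter cost.
for d in (1, 2, 4):
    span = effective_kernel(3, d)
    print(f"dilation {d}: 3x3 kernel covers a {span}x{span} window")

# Toy per-layer fusion of 4x4 single-channel feature maps from the
# RGB branch and the EVI branch.
rgb_feat = np.ones((4, 4))
evi_feat = np.full((4, 4), 0.5)
fused = fuse(rgb_feat, evi_feat)
print(fused[0, 0])  # 1.5
```

In a full network, the fused feature map at each encoder layer would both feed the next layer and be stored for the secondary fusion with decoder features described above.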



Abstract

The invention discloses a forest unstructured scene segmentation method based on multispectral image fusion. To handle the complex backgrounds, variable lighting, and shadow interference of forest unstructured scenes, the method introduces enhanced vegetation index (EVI) data and designs a parallel double-encoding structure that extracts RGB and EVI features, forming multi-modal complementary features during encoding. In the encoding stage, dilated convolution enlarges the network's receptive field and optimizes the feature extraction process. The decoding features are then fused a second time with the fusion features generated during encoding, yielding a multispectral fusion convolutional neural network; after training the network, inputting RGB and EVI images realizes semantic segmentation of the forest unstructured scene. The method effectively resolves the poor adaptability and mis-segmentation to which RGB-only unstructured segmentation methods are prone, and improves the accuracy and robustness of forest unstructured scene semantic segmentation.
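The abstract relies on enhanced vegetation index (EVI) data but does not reproduce its formula. A common formulation, assumed here for illustration (the standard MODIS EVI with coefficients G=2.5, C1=6, C2=7.5, L=1; the patent does not state which variant it uses), can be sketched as:

```python
import numpy as np

def evi(nir, red, blue, G=2.5, C1=6.0, C2=7.5, L=1.0):
    """Enhanced Vegetation Index from surface-reflectance bands in [0, 1].

    Uses the standard MODIS coefficients as an assumption; the patent
    itself does not specify its EVI formulation.
    EVI = G * (NIR - Red) / (NIR + C1*Red - C2*Blue + L)
    """
    nir, red, blue = (np.asarray(b, dtype=np.float64) for b in (nir, red, blue))
    return G * (nir - red) / (nir + C1 * red - C2 * blue + L)

# Dense vegetation reflects strongly in near-infrared and weakly in
# red, so it yields a high EVI value.
print(evi(0.5, 0.08, 0.04))  # -> 0.625
```

The function accepts whole band arrays, so applying it per pixel yields the EVI image that would be fed to the second encoder branch alongside the RGB image.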

Description

Technical Field

[0001] The invention belongs to the technical field of computer vision and unmanned vehicle environment perception, and relates to an unstructured scene segmentation method, in particular to a forest unstructured scene segmentation method based on multispectral image fusion.

Background Technique

[0002] In recent years, autonomous driving technology has developed rapidly and has become a popular field for researchers seeking to solve transportation problems. During driving, providing comprehensive, accurate, and reliable perception for autonomous vehicles is the premise of safe and reliable autonomous driving. Semantic segmentation, as an important part of autonomous driving scene perception, has accordingly been widely studied. At present, in the field of autonomous vehicle perception, the more mature visual semantic segmentation techniques are mainly designed for structured road environments, su...

Claims


Application Information

IPC(8): G06V10/26; G06V10/44; G06V10/82; G06N3/04; G06N3/08; G06T3/40
CPC: G06N3/08; G06T3/4007; G06N3/045
Inventors: Li Xu, Guo Zhifeng, Xu Qimin, Liu Xixiang, Zhu Jianxiao
Owner SOUTHEAST UNIV