High-precision farmland vegetation information extraction method

A high-precision information extraction technology applied in the field of agricultural remote sensing. It addresses the problems of uneven image intensity, low accuracy, and conditions unfavorable to target feature extraction and monitoring, achieving accurate crop information extraction results, improved accuracy, and reduced intensity unevenness.

Pending Publication Date: 2022-01-11
GUANGDONG OCEAN UNIVERSITY


Problems solved by technology

In practice, due to factors such as light-source inhomogeneity, inconsistent pixel response, dark-current bias, or elements in the optical path that affect imaging (such as dust on the detector surface), the UAV camera may output an image with uneven intensity even for a target with uniform grayscale. This is not conducive to extracting and monitoring target features in subsequent image processing, so the accuracy of farmland vegetation information extraction is low.
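
The patent does not write out a correction formula, but the degradation factors listed above are exactly what a standard flat-field correction (step S2 below) compensates for. One common formulation, stated here as an assumption rather than as the patent's own equation, is

\[
I_{\mathrm{corr}}(x,y) \;=\; \frac{I_{\mathrm{raw}}(x,y) - D(x,y)}{F(x,y) - D(x,y)} \cdot \overline{F - D}
\]

where \(D\) is a dark frame (dark current and bias), \(F\) is a flat frame of a uniformly illuminated target (vignetting, pixel response non-uniformity, dust shadows), and \(\overline{F - D}\) is the mean of the gain frame, which preserves the overall brightness level.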




Embodiment

[0036] Embodiment: a high-precision farmland vegetation information extraction method, as shown in Figure 1, specifically includes the following steps:

[0037] S1. Collect data: use a UAV to collect original farmland images.

[0038] S2. Perform flat-field correction processing on the original farmland image collected in step S1.
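
The patent does not give the exact correction formula used in S2; below is a minimal Python sketch of the flat-field correction stated above, assuming a dark frame and a flat frame were captured with the same camera settings. The file names and helper function are illustrative, not from the patent.

```python
import numpy as np
import cv2  # OpenCV, assumed available for image I/O

def flat_field_correct(raw, dark, flat, eps=1e-6):
    """Standard flat-field correction: (raw - dark) / (flat - dark) * mean(flat - dark)."""
    raw = raw.astype(np.float64)
    dark = dark.astype(np.float64)
    flat = flat.astype(np.float64)
    gain = flat - dark                                   # per-pixel gain frame
    corrected = (raw - dark) / np.maximum(gain, eps) * gain.mean()
    return np.clip(corrected, 0, 255).astype(np.uint8)

# Illustrative file names, not from the patent.
raw = cv2.imread("farmland_raw.png", cv2.IMREAD_GRAYSCALE)
dark = cv2.imread("dark_frame.png", cv2.IMREAD_GRAYSCALE)
flat = cv2.imread("flat_frame.png", cv2.IMREAD_GRAYSCALE)
cv2.imwrite("farmland_corrected.png", flat_field_correct(raw, dark, flat))
```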

[0039] S3. Import the flat-field-corrected image data from step S2 into Pix4Dmapper or ENVI software, and perform image stitching (mosaicking) and cropping according to the study area.
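
Pix4Dmapper and ENVI are desktop tools, so the stitching itself is interactive. As a hedged illustration only, the exported orthomosaic can also be cropped to the study area programmatically, for example with rasterio; the file paths and bounding box below are hypothetical.

```python
import rasterio
from rasterio.windows import from_bounds

# Hypothetical mosaic path and study-area bounds (in the mosaic's CRS).
mosaic_path = "farmland_mosaic.tif"
left, bottom, right, top = 500000.0, 2400000.0, 500500.0, 2400500.0

with rasterio.open(mosaic_path) as src:
    window = from_bounds(left, bottom, right, top, transform=src.transform)
    data = src.read(window=window)                      # bands x rows x cols
    profile = src.profile
    profile.update(height=data.shape[1], width=data.shape[2],
                   transform=src.window_transform(window))

with rasterio.open("study_area.tif", "w", **profile) as dst:
    dst.write(data)
```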

[0040] S4. Label the image data by visual interpretation, and preprocess it through data augmentation and image tiling to build a neural network dataset.
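
The patent names data augmentation and image cutting (tiling) as the preprocessing but does not specify parameters. A minimal sketch of both is given below, assuming the image and its label mask are NumPy arrays; the 256-pixel tile size and the flip/rotation choices are assumptions, not taken from the patent.

```python
import numpy as np

def tile(image, mask, size=256):
    """Cut an image and its label mask into non-overlapping size x size tiles."""
    tiles = []
    h, w = image.shape[:2]
    for y in range(0, h - size + 1, size):
        for x in range(0, w - size + 1, size):
            tiles.append((image[y:y+size, x:x+size], mask[y:y+size, x:x+size]))
    return tiles

def augment(image, mask):
    """Simple geometric augmentation: flips and 90-degree rotations, applied identically to image and mask."""
    pairs = [(image, mask), (np.fliplr(image), np.fliplr(mask)), (np.flipud(image), np.flipud(mask))]
    for k in (1, 2, 3):
        pairs.append((np.rot90(image, k), np.rot90(mask, k)))
    return pairs

def build_dataset(image, mask, size=256):
    """Tile first, then augment every tile."""
    samples = []
    for img_tile, mask_tile in tile(image, mask, size):
        samples.extend(augment(img_tile, mask_tile))
    return samples
```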

[0041] S5. Construct and train the Unet neural network model: feed the neural network dataset into the Unet model for training, adjust the hyperparameters according to the training results, and perform an accuracy evaluation to obtain the optimal model.

[0042] S6. Save the optimal model, input the farmland image to be identified into the saved model, and obtain the extracted farmland vegetation texture and spatial distribution information.
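
The patent does not disclose the exact Unet configuration or hyperparameters. The sketch below is a deliberately small Unet-style encoder-decoder in PyTorch whose depth, channel counts, binary-output assumption, and training hyperparameters are all illustrative; it only shows the skip-connection structure referenced in step S5 and the model saving referenced in step S6.

```python
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    """Two 3x3 convolutions with ReLU, the basic Unet building block."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )

class SmallUnet(nn.Module):
    def __init__(self, in_ch=3, num_classes=2):
        super().__init__()
        self.enc1 = conv_block(in_ch, 32)
        self.enc2 = conv_block(32, 64)
        self.bottleneck = conv_block(64, 128)
        self.pool = nn.MaxPool2d(2)
        self.up2 = nn.ConvTranspose2d(128, 64, 2, stride=2)
        self.dec2 = conv_block(128, 64)
        self.up1 = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec1 = conv_block(64, 32)
        self.head = nn.Conv2d(32, num_classes, 1)

    def forward(self, x):
        e1 = self.enc1(x)                       # skip connection 1
        e2 = self.enc2(self.pool(e1))           # skip connection 2
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)

# Minimal training step with cross-entropy loss; batch, learning rate, and labels are placeholders.
model = SmallUnet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()
images = torch.randn(2, 3, 256, 256)            # stand-in image batch
labels = torch.randint(0, 2, (2, 256, 256))     # stand-in label masks
loss = criterion(model(images), labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()
torch.save(model.state_dict(), "unet_farmland.pt")  # step S6: save the trained model
```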



Abstract

The invention discloses a high-precision farmland vegetation information extraction method and relates to the technical field of agricultural remote sensing. The technical scheme comprises the following steps: S1, collecting an original farmland image; S2, performing flat-field correction on the original farmland image; S3, importing the flat-field-corrected image data into Pix4Dmapper or ENVI software and carrying out image stitching and cropping; S4, labeling the image data and building a neural network dataset through data augmentation and image tiling; S5, constructing and training a Unet neural network model; S6, saving the model, inputting a farmland image to be identified into the saved model, and obtaining the extracted farmland vegetation texture and spatial distribution information. The method can quickly and accurately obtain crop texture and spatial distribution information from farmland remote sensing images, solves the problems of complex manual feature screening and low recognition accuracy in satellite remote sensing interpretation of farmland images, and provides a reference method for UAV remote sensing interpretation of farmland crop information.

Description

Technical field

[0001] The invention relates to the technical field of agricultural remote sensing, and more specifically to a high-precision farmland vegetation information extraction method.

Background technique

[0002] With the development of information technology and industry, remote sensing information has been widely used in agriculture, for example in crop classification, disaster prediction, cultivated land monitoring, and yield prediction. Traditional information acquisition and research on farmland crops mainly rely on manual feature selection and machine learning methods such as support vector machines, but these have certain limitations. In recent years, deep learning semantic segmentation has made great breakthroughs in image classification and has obvious advantages over manual feature classification methods. Scientific analysis and decision-making for agricultural production through the collection of farmland data by drones ...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06V20/13G06V10/40G06V10/764
CPCG06F18/241
Inventor 刘大召刘秋斌郭碧峰李卓
Owner GUANGDONG OCEAN UNIVERSITY