
Stumpage diameter at breast height measuring method based on machine vision and deep learning

A technology combining deep learning and machine vision, applied in the field of machine vision, which solves problems such as instruments that are difficult to carry, cumbersome to use, and hard to promote, and achieves the effects of improved efficiency, improved accuracy, and reduced error.

Patent status: Pending
Publication Date: 2022-01-04
GUANGXI UNIV

AI Technical Summary

Problems solved by technology

These instruments can achieve high measurement accuracy, but they are either bulky and difficult to carry, which limits the environments and range in which they can work, or expensive and difficult to promote on a large scale. At the same time, these instruments are complicated to operate, their usage steps are cumbersome, they place high demands on the measuring personnel, and they lack universality.




Embodiment Construction

[0034] The principles and features of the present invention are described below in conjunction with the accompanying drawings. The examples given are intended only to explain the invention and not to limit its scope. In the following paragraphs, the invention is described in more detail by way of example with reference to the accompanying drawings. The advantages and features of the present invention will be apparent from the following description and claims. It should be noted that the drawings are presented in a highly simplified form and not to precise scale; they serve only to conveniently and clearly illustrate the embodiments of the present invention.

[0035] Referring to Figures 1-11, in an embodiment of the present invention, a method for measuring the diameter at breast height of standing trees based on machine vision and deep learning comprises the following steps:

[0036] The first step is camera calibration and image correction. The ...
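The description is truncated here, but the abstract indicates that this step takes several photographs of a calibration plate, recovers the camera's internal and external parameters and distortion coefficients, and then undistorts the picture to be measured. Below is a minimal sketch of such a calibration-and-correction step using OpenCV's chessboard routines; the board size, file paths, and overall flow are illustrative assumptions, not the patent's exact procedure.

```python
# Hypothetical sketch of step 1 (camera calibration and distortion correction)
# using OpenCV chessboard calibration. Board size and file paths are assumed.
import glob

import cv2
import numpy as np

BOARD_SIZE = (9, 6)  # inner corners per row/column of the calibration plate (assumed)
criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001)

# Reference 3-D grid of the calibration plate corners (z = 0 plane).
objp = np.zeros((BOARD_SIZE[0] * BOARD_SIZE[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD_SIZE[0], 0:BOARD_SIZE[1]].T.reshape(-1, 2)

objpoints, imgpoints = [], []
gray = None
for path in glob.glob("calibration_plates/*.jpg"):  # assumed location of the plate photos
    img = cv2.imread(path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, BOARD_SIZE, None)
    if found:
        corners = cv2.cornerSubPix(gray, corners, (11, 11), (-1, -1), criteria)
        objpoints.append(objp)
        imgpoints.append(corners)

# Intrinsic matrix, distortion coefficients, and per-image extrinsics.
ret, mtx, dist, rvecs, tvecs = cv2.calibrateCamera(
    objpoints, imgpoints, gray.shape[::-1], None, None
)

# Undistort the picture to be measured with the recovered parameters.
target = cv2.imread("tree_to_measure.jpg")  # assumed file name
h, w = target.shape[:2]
new_mtx, roi = cv2.getOptimalNewCameraMatrix(mtx, dist, (w, h), 1, (w, h))
corrected = cv2.undistort(target, mtx, dist, None, new_mtx)
cv2.imwrite("tree_corrected.jpg", corrected)
```

In this sketch, the intrinsic matrix and distortion coefficients correspond to the "internal parameters and distortion coefficients" mentioned in the abstract, and the per-image rotation and translation vectors to the external parameters.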



Abstract

The invention discloses a stumpage diameter at breast height measuring method based on machine vision and deep learning. The method comprises the following steps. Step 1, camera calibration and image correction: a camera calibration and image correction module receives a plurality of calibration plate pictures shot by a camera, calibrates the camera to obtain its internal and external parameters and distortion coefficients, and uses these parameters and coefficients to apply distortion correction to the picture to be measured. Step 2, trunk image segmentation: the corrected picture enters a trunk image segmentation module, which uses a U-Net network to extract the trunk and obtain a segmented picture. With this method, the stumpage diameter at breast height can be measured accurately, rapidly, and conveniently, which improves the efficiency of forestry resource investigation and promotes the rapid development of intelligent forestry.
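The abstract only states that a U-Net network is used for trunk extraction. As one possible reading of step 2, the sketch below shows a compact U-Net-style segmentation module in PyTorch; the network depth, channel widths, input resolution, and 0.5 threshold are assumptions made for illustration, not details taken from the patent.

```python
# Minimal U-Net-style trunk segmentation sketch (architecture details assumed).
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch):
    """Two 3x3 convolutions with ReLU, the basic U-Net building block."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
    )


class SmallUNet(nn.Module):
    """Two-level encoder/decoder producing a 1-channel trunk mask."""

    def __init__(self):
        super().__init__()
        self.enc1 = conv_block(3, 32)
        self.enc2 = conv_block(32, 64)
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = conv_block(64, 128)
        self.up2 = nn.ConvTranspose2d(128, 64, kernel_size=2, stride=2)
        self.dec2 = conv_block(128, 64)
        self.up1 = nn.ConvTranspose2d(64, 32, kernel_size=2, stride=2)
        self.dec1 = conv_block(64, 32)
        self.head = nn.Conv2d(32, 1, kernel_size=1)

    def forward(self, x):
        e1 = self.enc1(x)                   # full resolution
        e2 = self.enc2(self.pool(e1))       # 1/2 resolution
        b = self.bottleneck(self.pool(e2))  # 1/4 resolution
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))   # skip connection
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))  # skip connection
        return torch.sigmoid(self.head(d1))  # per-pixel trunk probability


# Usage: feed the distortion-corrected picture and threshold the output mask.
model = SmallUNet()
corrected = torch.rand(1, 3, 256, 256)   # stand-in for the corrected image tensor
mask = (model(corrected) > 0.5).float()  # binary trunk mask
```

In a full pipeline, the binary trunk mask produced here would be passed on to the later measurement steps of the method.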

Description

Technical Field

[0001] The invention relates to the technical field of machine vision, and in particular to a method for measuring the diameter at breast height (DBH) of standing trees based on machine vision and deep learning.

Background Technique

[0002] At present, methods for measuring DBH can be divided into two categories according to the tools used: manual measurement methods using traditional tools such as calipers and diameter tapes, and precision instrument measurement methods using tools such as total stations and theodolites.

[0003] The manual measurement method using traditional tools is greatly affected by the environment, and its measurement range is limited; for tree trunks with a relatively large diameter at breast height, measurement may even be impossible. At the same time, the measurement data is entered manually, which also greatly affects measurement speed. In addition to this traditional method, some high-precision instruments can also be used for me...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/80; G06T7/11; G06T7/62; G06N3/04; G06N3/08; G01B11/00; G01B11/02; G01B11/08; G01B11/24
CPC: G06T7/80; G06T7/11; G06T7/62; G06N3/08; G01B11/08; G01B11/005; G01B11/02; G01B11/24; G06T2207/10004; G06T2207/20081; G06T2207/20084; G06T2207/30208; G06N3/045
Inventor: 黄汝维 (Huang Ruwei), 华蓓 (Hua Bei), 雷晨阳 (Lei Chenyang)
Owner: GUANGXI UNIV