Machine vision-based famous high-quality tea tender shoot identification and picking point positioning method

A machine-vision-based positioning technology, applied to neural learning methods, instruments, and image analysis, that addresses problems such as large data requirements, complex computation, and low efficiency, achieving accurate extraction, improved intelligence, and higher economic returns.

Active Publication Date: 2022-07-05
HANGZHOU DIANZI UNIV

AI Technical Summary

Problems solved by technology

[0006] Patent document CN113674226A discloses a tea bud-tip detection method based on deep learning. It detects bud tips with a Yolov4 model and then derives the picking-point coordinates through HSV image segmentation and convex-hull detection. Because picking-point extraction must wait for the model's prediction boxes and then recover the bud body from traditional color features, the approach requires more data, involves complex computation, and is inefficient.




Embodiment Construction

[0043] It should be noted that, provided there is no conflict, the embodiments of the present invention and the features of those embodiments may be combined with one another.

[0044] In the description of the present invention, it should be understood that terms indicating orientation or positional relationships, such as "center", "longitudinal", "transverse", "top", "bottom", "front", "rear", "left", "right", "vertical", "horizontal", "inner", and "outer", are based on the orientations or positional relationships shown in the drawings. They are used only for convenience and simplicity of description, and do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation; they should therefore not be construed as limiting the invention. In addition, the terms "first", "second", etc. are used for descriptive purposes only, and should not be constru...



Abstract

The invention discloses a machine-vision-based method for identifying tender shoots of famous high-quality tea and locating picking points. The method comprises the following steps. Step 1: build an original data set of famous high-quality tea tender shoots. Step 2: train on the original data set; after training, solidify the learned parameters into a model and output the model. Step 3: acquire a test image of tea tender shoots, input it to the model output in step 2, and output the prediction box and two-dimensional mask information of the tea tender shoots in the image. Step 4: obtain the areas of the different connected domains in the two-dimensional mask, compute the minimum enclosing rectangle of the largest connected domain, take the rotation angle of that rectangle as the bud-axis direction, and obtain the cutting angle along the tangential direction of the bud axis. Step 5: take the point 2% of the way up from the bottom along the bud axis as the picking point. An instance segmentation algorithm directly outputs the mask region of a tender shoot by learning deep inter-pixel features, so tea-bud edge extraction is accurate.

Description

Technical field: [0001] The invention relates to the technical field of tea-picking robots, and in particular to a machine-vision-based method for identifying tender shoots of famous high-quality tea and locating picking points. Background technique: [0002] The tea-growing income of Zhejiang Province comes mainly from famous and high-quality teas, where "famous tea" refers to better-quality tea made from one bud and one leaf or one bud and two leaves. Since current common tea-picking machines use reciprocating blades, they cannot distinguish the types of buds and leaves, and the integrity of the buds and leaves is difficult to guarantee; the picked tea is therefore of low quality, cannot meet the requirements for famous tea, and its price drops accordingly. For this reason, famous tea must still be picked by hand, and a picking method is urgently needed so that a machine can accurately and quickly identify t...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/00, G06T7/12, G06T7/187, G06T7/62, G06V10/82, G06N3/04, G06N3/08, G06Q50/02
CPC: G06T7/0002, G06T7/12, G06T7/187, G06T7/62, G06N3/08, G06Q50/02, G06T2207/20081, G06T2207/20084, G06N3/045, Y02A90/10
Inventor: 陈冬梅, 林佳, 闫莉婕, 范姗慧, 魏凯华
Owner: HANGZHOU DIANZI UNIV