Fruit picking robot target detection method based on deep learning in unstructured environment

A picking-robot and target-detection technology, applied in the field of deep-learning-based target detection for fruit picking robots. It addresses the problems of insufficient recognition accuracy, geometric features that change over the growth cycle, and high computer-hardware requirements, with the effects of reducing gradient vanishing and training difficulty, reducing computation and running time, and providing good robustness and generality.

Inactive Publication Date: 2021-01-26
CHONGQING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

[0005] In view of this, the object of the present invention is to provide a target detection method for fruit picking robots based on deep learning in an unstructured environment, to solve the problems that affect traditional target detection: sensitivity to lighting conditions, geometric features that change with the growth cycle, occlusion of fruit by branches and leaves, and fruit growing in overlapping clusters. It also overcomes the shortcomings of general deep-learning neural network models, namely the need for large training sets, heavy computation, long computation time, high computer-hardware requirements, and insufficient recognition accuracy.

Method used



Examples


Embodiment Construction

[0043] The embodiments of the present invention are described below through specific examples; those skilled in the art can readily understand other advantages and effects of the present invention from the contents disclosed in this specification. The present invention can also be implemented or applied through other specific embodiments, and various details in this specification can be modified or changed for different viewpoints and applications without departing from the spirit of the invention. It should be noted that the drawings provided in the following embodiments illustrate the basic idea of the present invention only schematically, and the following embodiments and their features can be combined with one another where no conflict arises.

[0044] Referring to Figure 1 to Figure 6, the present invention provides a flow chart of a method for detecting the target of a tomato picking robot based on deep learning ...



Abstract

The invention relates to a fruit picking robot target detection method based on deep learning in an unstructured environment, and belongs to the technical field of intelligent agricultural production. In the method, Mask R-CNN is used as the target detection framework: ResNet-101, combined with an FPN architecture, serves as the backbone network for target feature extraction; the feature maps output by the backbone are sent to an RPN to generate RoIs; the RoIs output by the RPN are then mapped onto the shared feature map to extract the corresponding target features; and finally the features are fed to an FC layer and an FCN layer for target detection, bounding-box regression, and instance segmentation, respectively. The method solves the low detection precision that traditional digital image processing suffers in an unstructured environment under illumination changes, branch-and-leaf occlusion, and overlapping fruit clusters, and also overcomes shortcomings of common target-detection neural networks such as complex structure, gradient vanishing, heavy training computation, and slow model convergence.
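The RoI-mapping step in the abstract (projecting an RPN proposal from image coordinates onto the shared feature map) is typically realised in Mask R-CNN with bilinear sampling (RoIAlign). The patent text publishes no code, so the following is only a minimal NumPy sketch of that idea; the 7×7 output grid, the 1/16 spatial scale, and the one-sample-per-bin simplification are illustrative assumptions, not the patent's parameters.

```python
import numpy as np

def bilinear_sample(fmap, y, x):
    """Bilinearly interpolate a single (y, x) point on a 2-D feature map."""
    h, w = fmap.shape
    y0, x0 = max(int(np.floor(y)), 0), max(int(np.floor(x)), 0)
    y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
    dy, dx = y - y0, x - x0
    return (fmap[y0, x0] * (1 - dy) * (1 - dx)
            + fmap[y0, x1] * (1 - dy) * dx
            + fmap[y1, x0] * dy * (1 - dx)
            + fmap[y1, x1] * dy * dx)

def roi_align(fmap, roi, out_size=7, spatial_scale=1 / 16):
    """Map an image-space RoI (x1, y1, x2, y2) onto the feature map and
    pool it to a fixed out_size x out_size grid, sampling the centre of
    each bin with bilinear interpolation (one sample per bin)."""
    x1, y1, x2, y2 = (c * spatial_scale for c in roi)
    bin_h = (y2 - y1) / out_size
    bin_w = (x2 - x1) / out_size
    out = np.empty((out_size, out_size))
    for i in range(out_size):
        for j in range(out_size):
            out[i, j] = bilinear_sample(fmap,
                                        y1 + (i + 0.5) * bin_h,
                                        x1 + (j + 0.5) * bin_w)
    return out
```

Because the bin centres are interpolated rather than rounded to integer cells, the pooled features stay aligned with the proposal, which is what makes the per-RoI mask branch (the FCN head) viable.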

Description

Technical field

[0001] The invention belongs to the technical field of intelligent agricultural production and relates to a target recognition and detection method for a fruit and vegetable picking robot, in particular to a target detection method for a fruit picking robot based on deep learning in an unstructured environment.

Background technique

[0002] Since the advent of fruit and vegetable picking robots, many new methods for target detection and recognition have emerged. Traditional digital image processing techniques, such as feature extraction and recognition based on colour, texture, and shape, rely on the external appearance of the target fruit; but these external features change with the fruit's growth stage, are easily affected by lighting conditions, and may be impossible to capture completely when the fruit is shaded by branches and leaves or when fruits overlap.

[0003] Wit...
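To see why the colour-feature methods criticised in [0002] are sensitive to lighting, consider a deliberately naive absolute-threshold rule. This is not from the patent; the threshold and pixel values are invented for illustration.

```python
import numpy as np

def naive_red_mask(img, red_min=150):
    """Naive colour segmentation: a pixel is 'fruit' if its red channel
    exceeds an absolute threshold and dominates green. The absolute
    threshold is what breaks when illumination drops."""
    r, g = img[..., 0].astype(int), img[..., 1].astype(int)
    return (r > red_min) & (r > g)

sunlit = np.array([[[210, 70, 60]]], dtype=np.uint8)  # ripe fruit in full sun
shaded = np.array([[[ 90, 30, 25]]], dtype=np.uint8)  # same fruit in shade

print(bool(naive_red_mask(sunlit)[0, 0]))  # True  -- detected
print(bool(naive_red_mask(shaded)[0, 0]))  # False -- missed under shading
```

The shaded pixel has the same red-dominant hue but fails the absolute threshold, which is exactly the illumination dependence that motivates learning-based detectors such as the Mask R-CNN pipeline of this patent.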

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/32; G06N3/04; G06T3/40; G06T7/11; G06T7/73
CPC: G06T3/4007; G06T7/73; G06T7/11; G06T3/4084; G06T2207/20081; G06T2207/20084; G06V20/10; G06V10/25; G06V10/255; G06V20/68; G06V2201/07; G06N3/045
Inventors: 郑太雄, 江明哲
Owner CHONGQING UNIV OF POSTS & TELECOMM