
A method for identifying and locating fruits to be picked based on depth target detection

A technology for target detection and fruit identification, applied to the field of identifying and locating fruits to be picked. It addresses problems such as the limited robustness of hand-designed feature extraction algorithms and uneven illumination, achieving effective and reliable identification and positioning while overcoming insufficient feature representation and poor robustness.

Inactive Publication Date: 2019-03-01
江苏德劭信息科技有限公司

AI Technical Summary

Problems solved by technology

Existing algorithms mainly rely on manually designed features extracted from the texture, color, and shape of the fruit to be picked. Under natural conditions, occlusion and lighting change with the scene, and the corresponding features of the fruit change with them, so the robustness of fruit feature extraction algorithms based on hand-crafted feature design needs to be improved. Moreover, fruit images acquired under natural production conditions usually contain complex backgrounds with occlusion or uneven illumination, so the fruit recognition rate and positioning accuracy under natural growing conditions also need to be improved.

Method used



Examples


Embodiment approach

[0036] The specific implementation is as follows:

[0037] Step 1 Image acquisition

[0038] A Canon digital camera was used to collect images of apples in natural growth scenes. The camera was held at roughly the height of an adult (about 1.8 meters), and the lens angle was chosen at random. During sample shooting, the apple targets were kept near the middle of the collected image and in clear focus, and the focal length was kept constant throughout sample capture.

[0039] Step 2 Image annotation

[0040] The apples in the 1,000 training set sample images are labeled manually: the true minimum circumscribed rectangle of each apple is marked, and the coordinates of its upper left and lower right vertices are recorded, as in the sketch below.
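A minimal sketch of what one annotation record could look like, assuming a plain Python dictionary / JSON layout; the file name, field names, and pixel values are illustrative and not specified by the patent, only the upper-left / lower-right vertex convention is:

# One labeled training image (hypothetical record layout).
annotation = {
    "image": "apple_0001.jpg",
    "boxes": [
        # Minimum circumscribed rectangle of each apple:
        # (x1, y1) = upper left vertex, (x2, y2) = lower right vertex, in pixels.
        {"x1": 312, "y1": 140, "x2": 398, "y2": 229},
        {"x1": 455, "y1": 301, "x2": 530, "y2": 377},
    ],
}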

[0041] Step 3 Dataset preparation

[0042] Of the 2,000 apple images collected in Step 1, 600 were randomly selected by hand as the validation set, 400 were used as the test set, and the remaining 1,000 were ... A sketch of such a split is given below.
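A minimal sketch of the random split, assuming plain Python over a list of file names; the patent fixes only the 600 / 400 / 1,000 counts, so the file naming and the use of random.shuffle are assumptions:

import random

# The 2,000 collected apple images (hypothetical file names).
image_files = [f"apple_{i:04d}.jpg" for i in range(2000)]
random.shuffle(image_files)

val_set   = image_files[:600]        # 600 validation images
test_set  = image_files[600:1000]    # 400 test images
train_set = image_files[1000:]       # remaining 1,000 training images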



Abstract

The invention discloses a method for identifying and locating fruit to be picked based on depth target detection, comprising the following steps: image acquisition, image annotation, dataset preparation, feature extraction, and model training. Feature extraction first uses a deep convolutional neural network (CNN) to extract the fruit features in the image; VGG16 is used as the convolutional neural network for feature extraction. In model training, for all training samples, the feature maps on each convolution layer are extracted first, prediction boxes are generated from the extracted feature maps, the categories of the targets in the prediction boxes are calculated, and the distance between the prediction boxes and the real annotation boxes is calculated; during training, the joint category loss and prediction-box offset loss are used as the target loss function. One round of training is completed by the complete calculation process described above, and training ends when the number of training iterations reaches a predetermined threshold or the loss falls below a predetermined threshold. A sketch of this training objective is given below.
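As a rough illustration of the training objective described in the abstract, the following PyTorch sketch pairs a VGG16 convolutional backbone with a small detection head and combines a category loss with a prediction-box offset loss. This is an assumption-laden reconstruction, not the patented implementation: the single-scale head, the number of boxes per cell, the smooth-L1 offset loss, and the weighting factor alpha are all illustrative choices.

import torch
import torch.nn as nn
import torchvision

class FruitDetector(nn.Module):
    """VGG16 feature extractor plus a head that predicts, for each default box,
    class scores (fruit / background) and box offsets. Single-scale for brevity;
    an SSD-style detector would attach heads to several feature maps."""
    def __init__(self, num_classes=2, boxes_per_cell=4):
        super().__init__()
        # Convolutional part of VGG16 as the feature extractor
        # (older torchvision versions use pretrained=False instead of weights=None).
        self.backbone = torchvision.models.vgg16(weights=None).features
        self.cls_head = nn.Conv2d(512, boxes_per_cell * num_classes, 3, padding=1)
        self.reg_head = nn.Conv2d(512, boxes_per_cell * 4, 3, padding=1)
        self.num_classes = num_classes

    def forward(self, x):
        feat = self.backbone(x)   # feature map from the last convolution block
        b = x.size(0)
        cls = self.cls_head(feat).permute(0, 2, 3, 1).reshape(b, -1, self.num_classes)
        reg = self.reg_head(feat).permute(0, 2, 3, 1).reshape(b, -1, 4)
        return cls, reg           # per-box class scores and offsets

def joint_loss(cls_pred, reg_pred, cls_target, reg_target, alpha=1.0):
    """Joint objective: category loss plus prediction-box offset loss.
    cls_target holds per-box class indices, reg_target the offsets to the
    real annotation boxes; the balance factor alpha is an assumption."""
    cls_loss = nn.functional.cross_entropy(
        cls_pred.reshape(-1, cls_pred.size(-1)), cls_target.reshape(-1))
    reg_loss = nn.functional.smooth_l1_loss(reg_pred, reg_target)
    return cls_loss + alpha * reg_loss

Training would then repeat this forward pass and loss minimization over all training samples until the iteration count reaches a predetermined threshold or the loss drops below one, as the abstract describes.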

Description

Technical field

[0001] The invention relates to a method for identifying and locating fruits to be picked, in particular to a method for identifying and locating fruits to be picked based on depth target detection.

Background technique

[0002] Because of the complexity of the natural environment in which fruit picking takes place, fruit picking still relies on manual labor, and the labor required for picking accounts for more than half of the labor input in the entire production process. With the continuing decline of the agricultural workforce and the continuing rise of labor costs in China, automatic fruit picking is of great significance for easing the labor shortage in the fruit industry, ensuring timely picking, and improving picking quality. Research on automatic fruit picking technology is therefore urgent.

[0003] Quickly and reliably finding the fruit target and determining the location of the fruit to be picked in the natura...

Claims


Application Information

IPC(8): G06K9/32, G06K9/62, G06N3/04, A01D46/30
CPC: A01D46/30, G06V10/245, G06V2201/07, G06N3/045, G06F18/214
Inventor: 邓杨敏, 李亨, 吕继团
Owner: 江苏德劭信息科技有限公司