
Fruit visual collaborative searching method of harvesting robot

A picking-robot search-method technology, applied in the field of target search for picking robots, which addresses the problems of poor adaptability, high cost and fixed search paths, and achieves the effect of saving picking time.

Active Publication Date: 2018-08-31
CHANGZHOU UNIV

AI Technical Summary

Problems solved by technology

In the existing literature at home and abroad, fruit search methods appear only incidentally in other technical literature on picking robots. For example, in research on fruit recognition, Jiménez A. R. et al. used an infrared laser ranging sensor to search for and detect spherical targets in an unstructured environment, but this approach has poor real-time performance and high cost.
In research on real-time obstacle-avoidance algorithms, Li Zhankun et al. set four different monocular-vision search paths for fruit, but because the paths are fixed, adaptability is poor.
Generally speaking, there is still no suitable fruit search method that would allow a picking robot to search for fruit effectively and then proceed to further identification.



Examples


Embodiment Construction

[0015] Embodiments of the present invention are further described below in conjunction with the accompanying drawings. The invention is described taking apples as an example, but it is equally applicable to other fruits.

[0016] During image target search, people tend first to pay attention to the salient regions of an image and to the likelihood objects within those regions. For the picking robot, a large-field-of-view camera mounted on its motion platform collects a global image of the fruit trees in the orchard, as in figure 2. When the large-field-of-view camera works, it first rotates to search for the vertical left or right boundary of the whole fruit region in the orchard; once the left or right boundary of the whole fruit region in the image remains constant, the approach is measured by the size change of the fruit region closest to the central horizontal line (as in image 3, where the black line is the boundary line and the blue circl...
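The large-field search in paragraph [0016] combines salient-region extraction, removal of small regions, and a check on the fruit region nearest the central horizontal line. Below is a minimal sketch of such a pipeline, assuming red fruit against green foliage and using a red-green colour contrast as the saliency cue; the function names, the Otsu threshold and the minimum-area value are illustrative assumptions, not details given in the patent.

```python
# Hedged sketch: salient-region extraction and fruit-likelihood filtering for the
# large-field-of-view image. All thresholds and names are assumptions.
import cv2
import numpy as np

def candidate_fruit_regions(bgr, min_area=200):
    """Return (centroid, area) of connected regions that are likely fruit."""
    b, g, r = cv2.split(bgr.astype(np.float32))
    saliency = np.clip(r - g, 0, None)                 # red fruit stands out from foliage
    saliency = cv2.normalize(saliency, None, 0, 255,
                             cv2.NORM_MINMAX).astype(np.uint8)
    _, mask = cv2.threshold(saliency, 0, 255,
                            cv2.THRESH_BINARY | cv2.THRESH_OTSU)
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    regions = []
    for i in range(1, n):                              # label 0 is the background
        area = stats[i, cv2.CC_STAT_AREA]
        if area >= min_area:                           # "remove small regions"
            regions.append((centroids[i], area))
    return regions

def region_nearest_center_line(regions, image_height):
    """Pick the fruit region closest to the central horizontal line of the image."""
    cy = image_height / 2.0
    return min(regions, key=lambda rc: abs(rc[0][1] - cy), default=None)
```

Tracking the area returned for the region nearest the centre line across successive frames would give the size-change cue mentioned above; how the patent actually computes the target likelihood value is not spelled out in the visible text.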



Abstract

The invention discloses a fruit visual collaborative searching method for a harvesting robot. The method employs a dual-view-field (large and small) visual acquisition system. A large-view-field camera mounted on the mobile platform of the robot performs global imaging of a fruit tree in an orchard, extracts salient regions from the image, measures fruit likelihood targets with a target likelihood value, removes small regions, determines the approximate region of the fruit target in the large-view-field image, and guides the robot toward the fruit tree. A small-view-field camera mounted on the mechanical arm that carries the robot's end effector is used to establish a large-view-field image coordinate system and a small-view-field image coordinate system, together with the mapping relationships between these view-field coordinate systems and the world coordinate system, and to perform regional imaging based on the fruit target search region information obtained from the current large-view-field image and the coordinate-system mapping relationship, so as to search for fruit cooperatively. The method resembles human-eye target searching, avoids blind, disordered searching by the robot, and lays the foundation for subsequent precise fruit identification.
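The cooperative search described in the abstract hinges on mapping a fruit search region found in the large-view-field image, through the world coordinate system, into the small-view-field image. The sketch below illustrates that chain with a standard pinhole camera model; the calibration matrices, the assumed fruit depth and all variable names are assumptions for illustration, not the patent's own formulation.

```python
# Hedged sketch of the large-field -> world -> small-field coordinate mapping.
# K, R, t are assumed to come from a prior calibration of each camera.
import numpy as np

def pixel_to_world(u, v, depth, K, R, t):
    """Back-project a large-field pixel (u, v) at an assumed depth into world coords."""
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0]) * depth   # point in camera frame
    return R.T @ (ray_cam - t)                                   # camera -> world

def world_to_pixel(Xw, K, R, t):
    """Project a world point into the small-field camera image."""
    Xc = R @ Xw + t                                              # world -> camera
    uvw = K @ Xc
    return uvw[:2] / uvw[2]                                      # perspective division

# Illustrative use: the centroid of the fruit search region found in the large-field
# image is converted to a target pixel in the small-field image.
# Xw = pixel_to_world(u_large, v_large, depth_est, K_large, R_large, t_large)
# u_small, v_small = world_to_pixel(Xw, K_small, R_small, t_small)
```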

Description

Technical field

[0001] The invention relates to the technical field of target search for picking robots, especially visual search for fruit.

Background technique

[0002] When the picking robot enters the orchard, its primary task is to search for fruit. So-called fruit search refers to the process of finding the fruit region for further identification during the robot's work. In the existing literature at home and abroad, fruit search methods appear only incidentally in other technical literature on picking robots. For example, in research on fruit recognition, Jiménez A. R. et al. used an infrared laser ranging sensor to search for and detect spherical targets in an unstructured environment, but this approach has poor real-time performance and high cost. In research on real-time obstacle-avoidance algorithms, Li Zhankun et al. set four different monocular-vision search paths for fruit, but because the paths are fixed, the...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/46; A01D46/00
CPC: A01D46/00; G06V20/38; G06V10/50; G06V10/462
Inventors: 吕继东, 吕小俊, 徐黎明, 邹凌, 杨彪, 戎海龙, 陈阳
Owner: CHANGZHOU UNIV