
A vision-based robot navigation method for agricultural and forestry parks

A navigation method and robot technology, applied in the field of vision-based robot navigation in agricultural and forestry parks, which solves the problems that GPS cannot locate the robot accurately and that the robot cannot turn around accurately between rows, and achieves the effects of correct navigation, elimination of constraints, and a simplified navigation algorithm.

Active Publication Date: 2022-02-08
INST OF AGRI RESOURCES & REGIONAL PLANNING CHINESE ACADEMY OF AGRI SCI

AI Technical Summary

Problems solved by technology

[0007] The present invention provides a method combining visual navigation with auxiliary signage guidance, aimed at the problem that GPS cannot accurately locate a robot in a densely planted orchard and that the robot therefore cannot turn around accurately between rows.



Examples


Embodiment Construction

[0021] Embodiments of the present invention are described below with reference to the drawings, in which like parts are denoted by like reference numerals. Where there is no conflict, the following embodiments and the technical features in them may be combined with each other. The present invention is described below taking an orchard as an example, but it is equally applicable in forest areas.

[0022] Figure 1 shows an embodiment of the method of the present invention. The method includes: S1, setting auxiliary signs at the entrances and exits of the orchard rows in the agricultural and forestry park; S2, when the robot moves in front of an auxiliary sign, adjusting its distance and angle relative to the sign, and then completing the turn.
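
The patent text gives no concrete control details for step S2, so the following is a minimal sketch of how the distance-and-angle adjustment in front of the sign could be expressed as a simple proportional controller. The target stand-off distance, tolerances, and gains (TARGET_DISTANCE_M, DIST_TOL_M, ANGLE_TOL_RAD, the 0.3 and 0.8 factors) are illustrative assumptions, not values from the invention.

from dataclasses import dataclass

@dataclass
class Command:
    linear: float     # forward speed command (m/s), positive = towards the sign
    angular: float    # turn-rate command (rad/s), positive = counter-clockwise
    start_turn: bool  # True once the robot is positioned and may begin the turn

# Assumed tuning values; the patent does not give concrete numbers.
TARGET_DISTANCE_M = 1.0
DIST_TOL_M = 0.10
ANGLE_TOL_RAD = 0.05

def align_to_sign(distance_m: float, angle_rad: float) -> Command:
    """Step S2 sketch: adjust distance and heading relative to the auxiliary
    sign, then signal that the turn into the next row can begin."""
    if abs(distance_m - TARGET_DISTANCE_M) < DIST_TOL_M and abs(angle_rad) < ANGLE_TOL_RAD:
        return Command(0.0, 0.0, start_turn=True)
    # Simple proportional adjustment towards the target pose in front of the sign.
    return Command(linear=0.3 * (distance_m - TARGET_DISTANCE_M),
                   angular=-0.8 * angle_rad,
                   start_turn=False)

if __name__ == "__main__":
    print(align_to_sign(1.6, 0.12))   # still approaching and rotating
    print(align_to_sign(1.02, 0.01))  # aligned: begin the turn

A supervising loop would call align_to_sign with the measurements from the vision system on each cycle and trigger the fixed turning manoeuvre once start_turn is reported.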

[0023] In one embodiment, the auxiliary sign can have a specially designed appearance, as shown in Figures 2 to 5. The auxiliary sign adopts a graphic with...
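
Whatever graphic is chosen, once the sign's bounding box is located in the camera image, its distance and bearing can be estimated from the sign's known physical width with the pinhole camera model. The sketch below is only an illustration of that geometry; the camera intrinsics (FOCAL_PX, IMAGE_CX) and the sign width are assumed values, not figures from the patent.

import math

# Assumed camera intrinsics and sign size; the patent does not give values.
FOCAL_PX = 800.0        # focal length in pixels
IMAGE_CX = 640.0        # principal point x (image centre column)
SIGN_WIDTH_M = 0.30     # physical width of the auxiliary sign

def sign_range_and_bearing(bbox_left_px: float, bbox_right_px: float):
    """Estimate distance (m) and bearing (rad) of the sign from its bounding
    box in the image, using the pinhole camera model."""
    width_px = bbox_right_px - bbox_left_px
    if width_px <= 0:
        raise ValueError("invalid bounding box")
    distance_m = FOCAL_PX * SIGN_WIDTH_M / width_px           # similar triangles
    centre_px = 0.5 * (bbox_left_px + bbox_right_px)
    bearing_rad = math.atan2(centre_px - IMAGE_CX, FOCAL_PX)  # positive = sign to the right
    return distance_m, bearing_rad

if __name__ == "__main__":
    d, b = sign_range_and_bearing(590.0, 710.0)
    print(f"distance ≈ {d:.2f} m, bearing ≈ {math.degrees(b):.1f}°")

The estimated distance and bearing are exactly the two quantities the robot adjusts in step S2 before turning.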



Abstract

The present invention proposes a vision-based navigation method for robots in agricultural and forestry parks, including: S1, setting auxiliary signs at the entrances and exits of orchard rows in the agricultural and forestry park; S2, when the robot moves in front of an auxiliary sign, adjusting the distance and angle relative to the sign, and completing the turn. The method addresses the situation in which a robot in an orchard cannot rely on GPS positioning: using the orchard row-belt detection method and the positioning guide signs, the robot can travel precisely, turn, and find the next row without GPS, satisfying the need for robots to work continuously in orchards. In an orchard environment, GPS signals are often lost, positioning is inaccurate, and subsequent operating costs are high. The invention realizes correct navigation of the robot without GPS and reduces the economic investment.
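
The abstract refers to an orchard row-belt detection method without detailing it. As one possible illustration, the sketch below segments vegetation with the common excess-green (ExG) index and takes the low-vegetation column band between two crop rows as the drivable belt; the index, threshold, and offset computation are assumptions made for illustration, not the patent's specific algorithm.

import numpy as np

def row_band_offset(rgb: np.ndarray, exg_threshold: float = 20.0) -> float:
    """Estimate the lateral offset of the drivable band between two crop rows.

    rgb: HxWx3 uint8 frame from the robot's forward camera.
    Returns the offset of the band centre from the image centre in pixels
    (negative = band is to the left of the image centre).
    """
    img = rgb.astype(np.float32)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    exg = 2.0 * g - r - b                    # excess-green vegetation index
    vegetation = exg > exg_threshold         # True where plants are
    # Column-wise vegetation count: low counts mark the bare band between rows.
    column_counts = vegetation.sum(axis=0)
    band_columns = np.flatnonzero(column_counts < 0.2 * rgb.shape[0])
    if band_columns.size == 0:
        return 0.0                           # no clear band found; assumed fallback
    band_centre = float(band_columns.mean())
    image_centre = rgb.shape[1] / 2.0
    return band_centre - image_centre

if __name__ == "__main__":
    # Synthetic frame: green strips on the left and right, bare soil in the middle.
    frame = np.full((120, 160, 3), (120, 100, 80), dtype=np.uint8)  # brown soil
    frame[:, :50] = (40, 180, 40)    # left crop row
    frame[:, 110:] = (40, 180, 40)   # right crop row
    print(row_band_offset(frame))    # ≈ 0 when the band is centred

The returned pixel offset could feed the same kind of proportional steering used when aligning to a sign.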

Description

Technical Field

[0001] The invention relates to navigation technology and, more specifically, to a vision-based navigation method for robots in agricultural and forestry parks.

Background Technique

[0002] Multifunctional agricultural robots are now widely used and increasingly replace manual labor in the fields. Orchard production involves many kinds of tasks, such as flower thinning and fruit setting, bagging, pruning, mulching, irrigation, fertilization, pesticide spraying, pest control, and staged harvesting, all of which require a great deal of manpower and material resources. Inaccurate orchard management generates many ineffective inputs and ecological pollution, which increases fruit prices. In response to these situations, it is imperative to develop intelligent and precise robots suitable for orchard operations. In order for the robot to replace human labor and work autonomously in the orchard...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G05D1/02
Inventors: 史云 (Shi Yun), 李会宾 (Li Huibin), 吴文斌 (Wu Wenbin), 杨鹏 (Yang Peng), 唐华俊 (Tang Huajun)
Owner: INST OF AGRI RESOURCES & REGIONAL PLANNING CHINESE ACADEMY OF AGRI SCI