
Large-field-of-view scene perception method and system based on omnidirectional vision, medium and device

A scene perception and large-field-of-view technology, applied in the field of large-field-of-view scene perception. It addresses the problem that traditional low-complexity algorithms cannot effectively identify obstacles, and achieves accurate perception of the surrounding environment with excellent scene understanding and robustness.

Active Publication Date: 2019-04-26
SHANDONG UNIV

AI Technical Summary

Problems solved by technology

The inventor also found that traditional methods process images using only the low-level visual information of the image pixels themselves. Such methods have no training stage and their algorithmic complexity is typically low; against complex environmental backgrounds they cannot effectively identify obstacles and other valid information.

Method used



Examples


Embodiment 1

[0035] The omnidirectional vision-based large-field-of-view scene perception method of this embodiment can run on the ROS (Robot Operating System) platform.
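As a hedged illustration only: a minimal rospy node skeleton of how such a method might be hosted on ROS, assuming hypothetical camera topic names and node structure (the excerpt does not specify topics or how frames are dispatched):

```python
# Sketch of a ROS node subscribing to several surround-camera image topics.
# Topic names and the callback body are assumptions, not from the patent.
import rospy
from sensor_msgs.msg import Image

CAMERA_TOPICS = [f"/camera_{i}/image_raw" for i in range(6)]  # hypothetical names

def on_image(msg, camera_index):
    # Placeholder: a real implementation would run semantic segmentation on
    # the frame and update that camera's local grid map.
    rospy.loginfo("frame from camera %d (%dx%d)", camera_index, msg.width, msg.height)

def main():
    rospy.init_node("omnidirectional_scene_perception")
    for i, topic in enumerate(CAMERA_TOPICS):
        rospy.Subscriber(topic, Image, on_image, callback_args=i, queue_size=1)
    rospy.spin()

if __name__ == "__main__":
    main()
```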

[0036] As shown in Figure 1 and Figure 2, the omnidirectional vision-based large-field-of-view scene perception method of this embodiment includes at least:

[0037] S101: Construct local grid maps for all cameras surrounding the robot; the combined viewing angle of these cameras covers the 360-degree environment around the robot. A sketch of the rasterization step is given below.
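A minimal sketch of the grid construction in S101, under the assumption that obstacle pixels have already been projected into two-dimensional plane coordinates (that projection is sketched under Embodiment 2). The grid size and resolution are illustrative values, not taken from the patent:

```python
import numpy as np

def build_local_grid(obstacle_points_xy, size_m=10.0, resolution=0.05):
    """Rasterize Nx2 obstacle points (metres, robot-plane frame) into an
    occupancy grid. Grid geometry values are assumptions for illustration."""
    n = int(size_m / resolution)
    grid = np.zeros((n, n), dtype=np.uint8)          # 0 = free, 1 = occupied
    # Shift coordinates so the robot sits at the grid centre.
    ij = np.floor((obstacle_points_xy + size_m / 2.0) / resolution).astype(int)
    in_bounds = (ij >= 0).all(axis=1) & (ij < n).all(axis=1)
    grid[ij[in_bounds, 1], ij[in_bounds, 0]] = 1     # row = y index, col = x index
    return grid
```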

[0038] In this embodiment, six cameras are taken as an example: six surround cameras are mounted on the robot komodo2 so that environmental information in all 360 degrees around the robot can be collected.

[0039] It should be noted that other numbers of cameras can also be used; the actual number is determined by the cameras' fields of view, so long as the 360° environment is covered, as the sketch below illustrates.
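A small sketch of that sizing rule: the minimum camera count follows from each camera's horizontal field of view. The overlap margin used here is an assumption for illustration; the patent only requires full coverage:

```python
import math

def min_cameras(horizontal_fov_deg, overlap_deg=10.0):
    """Smallest camera count whose combined horizontal FOV covers 360 degrees,
    leaving an assumed overlap margin between neighbouring cameras."""
    effective = horizontal_fov_deg - overlap_deg
    return math.ceil(360.0 / effective)

# e.g. cameras with a 70-degree horizontal FOV and 10 degrees of overlap:
print(min_cameras(70.0))  # -> 6, matching the six-camera example above
```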

[0040] Sp...

Embodiment 2

[0120] The omnidirectional vision-based large-field-of-view scene perception system of this embodiment includes at least:

[0121] (1) Several cameras, arranged around the robot, whose combined viewing angle covers the 360-degree environment around the robot.

[0122] (2) A perception processor, which comprises:

[0123] (2.1) A local grid map construction module, used to construct local grid maps for all cameras surrounding the robot; the combined viewing angle of these cameras covers the 360-degree environment around the robot.

[0124] Specifically, the local grid map construction module further includes:

[0125] (2.1.1) a relationship building module, used to obtain the relationship between pixel coordinates and the two-dimensional coordinates of the camera plane from the camera's corresponding transformation matrix (see the sketch after this list);

[0126] (2.1.2) a global path pl...
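A hedged sketch of the relationship in (2.1.1): if the camera's transformation matrix is a 3x3 homography H mapping pixel coordinates to plane coordinates, it can be applied as below. The function name and the assumption that H comes from camera calibration are ours, not the patent's:

```python
import numpy as np

def pixels_to_plane(pixels_uv, H):
    """Map Nx2 pixel coordinates to Nx2 camera-plane coordinates using a 3x3
    homography H (pixel -> plane), assumed to come from calibration."""
    uv1 = np.hstack([pixels_uv, np.ones((len(pixels_uv), 1))])  # homogeneous coords
    xyw = uv1 @ H.T
    return xyw[:, :2] / xyw[:, 2:3]                             # dehomogenize
```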

Embodiment 3

[0141] This embodiment provides a computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, it carries out the steps of the omnidirectional vision-based large-field-of-view scene perception method shown in Figure 1.



Abstract

The invention provides a large-field-of-view scene perception method and system based on omnidirectional vision, a medium and a device. The method comprises the following steps: receiving images transmitted by all cameras in real time, identifying obstacle information through image semantic segmentation, and constructing local grid maps for all the cameras, where the cameras are arranged around a robot and their total viewing angle covers the 360-degree environment around the robot; and carrying out path planning using the constructed local grid maps and controlling the robot's motion according to the planned path. The method collects environmental information in all 360 degrees around the robot, compensates for the small visual range of a single camera, adapts to more complex environments, and offers excellent scene generalization and robustness.
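The abstract's final step, path planning on the constructed grid maps, is not tied to any particular planner in this excerpt. As a stand-in sketch only, a minimal breadth-first search over a merged occupancy grid:

```python
from collections import deque

def bfs_plan(grid, start, goal):
    """Shortest 4-connected path on an occupancy grid (0 = free, 1 = occupied).
    BFS is an illustrative stand-in; the patent does not fix a planner."""
    rows, cols = len(grid), len(grid[0])
    parent = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:      # walk parents back to the start
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in parent:
                parent[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # no path found
```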

Description

Technical Field

[0001] The present disclosure belongs to the field of robot navigation and obstacle avoidance, and in particular relates to a method, system, medium and device for large-field-of-view scene perception based on omnidirectional vision.

Background Technique

[0002] The statements in this section merely provide background information related to the present disclosure and do not necessarily constitute prior art.

[0003] Autonomous navigation and obstacle avoidance is one of a robot's core tasks. It is a system involving multi-sensor fusion for the mobile robot's environment perception together with a motion control system that updates the path in real time according to a given algorithm, so that the robot can pass existing static or dynamic obstacles and finally reach the target point.

[0004] The robot can use map information to plan a global path. During navigation, multiple sensor information is fus...


Application Information

IPC(8): G01C21/20; G05D1/02
CPC: G01C21/20; G05D1/0212; G05D1/0231
Inventors: 杨帅, 张伟, 赵仲伟, 邓寒, 谭文浩, 顾建军
Owner: SHANDONG UNIV