Multi-sensor real-time fusion method for mobile robot remote takeover scene

A mobile robot and multi-sensor technology, applied to color television components, television system components, televisions, etc.; it addresses problems such as redundant data, loss of distance information, and inability to meet application-scenario requirements, with the effect of speeding up transmission and removing redundant data.

Active Publication Date: 2020-10-09
ZHEJIANG UNIV
10 Cites · 3 Cited by

AI Technical Summary

Problems solved by technology

[0003] At this stage, remote takeover systems usually transmit multiple video streams directly to the client for display, but this approach has the following problems: 1. Only a single type of sensor is used: cameras alone collect environmental information, so important distance information is lost. 2. The volume of transmitted data is large and highly redundant, placing heavy demands on network bandwidth and transmission speed. 3. The multiple video channels are not displayed synchronously. 4. The raw multi-channel video streams are displayed directly, which makes it inconvenient for the operator to make decisions.
[0004] The Chinese patent with publication number CN108495060A proposes a real-time stitching method for high-definition video. After initialization, each step of the video stitching stage is accelerated in parallel, achieving a stitching frame rate of at least 30 fps for two video streams while also resolving problems such as ghosting and blurring. However, this method only stitches two video streams and cannot handle three or more, so it still cannot meet the needs of modern intelligent mobile robot application scenarios.
[0005] The Chinese patent with publication number CN103856727B proposes a real-time stitching method for multi-channel vid…

Method used




Embodiment Construction

[0039] In order to describe the present invention more specifically, the technical solutions of the present invention are described in detail below with reference to the accompanying drawings and specific embodiments.

[0040] As shown in Figure 1, the present invention provides a multi-sensor real-time fusion method for the remote takeover scene of a mobile robot, which includes the following steps:

[0041] Step 1: The intelligent mobile robot uses three cameras whose positions are fixed relative to one another.

[0042] In the remote takeover scenario, cameras can collect visual information but lose important distance information, and they perform poorly in harsh environments and weather. A laser radar (lidar) can collect accurate distance information, but its resolution is low. The present invention therefore uses a lidar together with three monocular cameras to compensate for the shortcomings of any single sensor. One camera faces directly forward, and the other two ca…
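Fusing lidar with monocular cameras of this kind typically requires projecting each lidar point into a camera image via the extrinsic (lidar-to-camera) and intrinsic calibration. The patent excerpt does not give its calibration or projection details, so the following is only a minimal sketch with hypothetical parameters, using the standard pinhole projection model:

```python
import numpy as np

def project_lidar_to_image(points_xyz, K, R, t):
    """Project Nx3 lidar points (lidar frame) into pixel coordinates.

    K: 3x3 camera intrinsic matrix.
    R (3x3), t (3,): hypothetical extrinsics mapping lidar frame to camera frame.
    Returns (uv, depth) for points in front of the camera.
    """
    cam = points_xyz @ R.T + t          # transform points into the camera frame
    in_front = cam[:, 2] > 0            # keep only points with positive depth
    cam = cam[in_front]
    uv_h = cam @ K.T                    # homogeneous pixel coordinates
    uv = uv_h[:, :2] / uv_h[:, 2:3]     # perspective divide
    return uv, cam[:, 2]                # pixel coords and per-point depth

# Hypothetical calibration: 1280x720 image, identity rotation,
# lidar mounted 0.1 m behind the camera along its optical axis.
K = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 360.0],
              [0.0,   0.0,   1.0]])
R = np.eye(3)
t = np.array([0.0, 0.0, 0.1])

pts = np.array([[0.0, 0.0, 5.0],    # straight ahead, 5 m away
                [1.0, 0.0, 10.0],   # ahead and to the right
                [0.0, 0.0, -2.0]])  # behind the camera: discarded
uv, depth = project_lidar_to_image(pts, K, R, t)
# The first point lands at the principal point (640, 360) with depth 5.1 m.
```

Painting `depth` values onto the image at the returned `uv` locations is one common way to produce the "video stream with depth information" the abstract describes; the patent's actual fusion procedure may differ.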



Abstract

The invention discloses a multi-sensor real-time fusion method for the remote takeover scene of a mobile robot. In this method, a laser radar and multiple cameras are used to collect data, overcoming the defects of any single sensor, and their outputs are fused to generate a video stream with depth information so that the operator knows the distance to surrounding objects. To address the stitching-seam and ghosting problems of linear fusion, the invention provides a linear fusion method with a threshold, which eliminates seams and ghosting while preserving real-time performance and improves the stitching quality. In view of the practical constraints of an intelligent mobile robot, real-time fusion of multiple high-definition video streams with depth information is achieved through offline parameter updating and a specific parallelization method, which removes redundant multi-sensor information while reducing data-transmission delay, giving the method practical application value.
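The abstract names a "linear fusion method with a threshold" for suppressing seams and ghosting in the overlap region, but this excerpt does not give the formula. One common interpretation, sketched below under that assumption (not the patent's verified method), is to blend linearly across the overlap strip where the two images agree, and fall back to a winner-take-all choice where they disagree (e.g. where a moving object would otherwise ghost):

```python
import numpy as np

def blend_overlap(left, right, diff_thresh=30.0):
    """Fuse two aligned overlap strips (H x W grayscale, float arrays).

    Assumed interpretation of 'linear fusion with a threshold':
    where |left - right| is below the threshold, blend linearly across
    the strip width; where the images disagree (likely ghosting), take
    the pixel from whichever image dominates that side of the seam.
    """
    h, w = left.shape
    alpha = np.linspace(1.0, 0.0, w)[None, :]      # weight for `left`, 1 -> 0
    linear = alpha * left + (1.0 - alpha) * right  # plain linear blend
    hard = np.where(alpha >= 0.5, left, right)     # winner-take-all fallback
    agree = np.abs(left - right) < diff_thresh     # per-pixel threshold mask
    return np.where(agree, linear, hard)
```

With similar inputs the output ramps smoothly from `left` to `right`; with strongly disagreeing inputs each half of the strip copies one source image, avoiding the double-exposure look of a pure linear blend. The threshold value and agreement metric here are illustrative choices.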

Description

Technical field

[0001] The invention belongs to the technical field of multi-sensor fusion, and in particular relates to a multi-sensor real-time fusion method for the remote takeover scene of a mobile robot.

Background technique

[0002] With the development of science and technology in recent years, intelligent mobile robots have become increasingly widely used in daily life. They can complete corresponding tasks in specific scenarios and partially replace human labor, for example AGV sorting robots, sweeping robots, and autonomous delivery vans. However, at this stage, intelligent mobile robots have not yet reached the level of fully autonomous problem solving. When an intelligent mobile robot is in a complex scene or an unknown scene arises, its own task decision-making system is not sufficient to handle the new situation; without remote assistance, the situation becomes uncontrollable and may even cause accidents and losses. The abnormality of the func…

Claims


Application Information

IPC(8): H04N5/265, H04N5/262
CPC: H04N5/262, H04N5/265
Inventors: 李红, 杨国青, 朱春林, 吕攀, 吴朝晖
Owner: ZHEJIANG UNIV