
Visual perception device and method for ship navigation environment

A visual perception technology for ship navigation environments, applied in neural learning methods, instruments, and biological neural network models. It addresses the difficulty of accurately identifying complex water-surface navigation environments in the prior art, improves the stability and accuracy of visual detection, enriches and makes more reliable the data available for network training, and enables autonomous navigation.

Pending Publication Date: 2021-11-26
WUHAN UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0004] In view of this, it is necessary to provide a visual perception device and method for a ship's navigation environment, so as to solve the problem in the prior art that complex water-surface navigation environments are difficult to identify accurately.




Embodiment Construction

[0042] Preferred embodiments of the present invention are described in detail below with reference to the accompanying drawings. The drawings form a part of the application and, together with the embodiments, serve to explain the principle of the invention; they are not intended to limit its scope.

[0043] In the description of the present invention, the terms "first" and "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or as implicitly specifying the quantity of the indicated technical features. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In addition, "plurality" means at least two, for example two or three, unless otherwise clearly and specifically defined.

[0044] In the description of the present invention, reference to "an embodiment" means tha...



Abstract

The invention relates to a visual perception device and method for a ship navigation environment. The method comprises the following steps: acquiring a multi-frame fused polarization image of the ship navigation environment, a radar data map, and an RGB (red, green, blue) image; performing information fusion on the fused polarization image, the radar data map, and the RGB image, and inputting the result into an improved conditional generative adversarial network for data enhancement, generating fused enhanced data; and inputting the fused enhanced data into a multi-scale convolutional neural network for scene segmentation and recognition, generating a scene segmentation map and identifying different objects in the ship navigation environment. The method acquires scene image features in a multi-scale, multi-modal manner and captures the visual characteristics of different scenes on the water. Data fusion based on radar and vision improves the stability and accuracy of the visual detection result; the data enhancement method and the multi-scale convolutional neural network ensure the effectiveness of network recognition and classification, enabling autonomous navigation of the intelligent ship.
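The abstract's first step, multi-modal information fusion, can be pictured as aligning the three inputs to a common grid and stacking them channel-wise before they reach the enhancement network. The sketch below is a minimal illustration of that idea only; the array names, shapes, and normalization are assumptions for illustration, not the patent's actual implementation.

```python
import numpy as np

# Illustrative stand-ins for the three modalities, assumed pre-aligned
# to the same 64x64 grid (real data would need registration/resampling).
H, W = 64, 64
rng = np.random.default_rng(0)
polarization = rng.random((H, W, 1))  # multi-frame fused polarization image (1 channel)
radar_map    = rng.random((H, W, 1))  # radar data map projected onto the image grid
rgb          = rng.random((H, W, 3))  # RGB camera frame

def fuse_modalities(pol, radar, rgb):
    """Normalize each modality to [0, 1] and concatenate along the channel axis."""
    def norm(x):
        lo, hi = x.min(), x.max()
        return (x - lo) / (hi - lo + 1e-8)
    return np.concatenate([norm(pol), norm(radar), norm(rgb)], axis=-1)

fused = fuse_modalities(polarization, radar_map, rgb)
print(fused.shape)  # (64, 64, 5): 1 polarization + 1 radar + 3 RGB channels
```

The fused 5-channel tensor would then be the input to the conditional GAN for data enhancement and, downstream, to the multi-scale CNN for segmentation.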

Description

technical field [0001] The invention relates to the technical field of autonomous navigation, and in particular to a visual perception device and method for a ship's navigation environment. Background technique [0002] Traditional image semantic segmentation methods mainly include pixel-level thresholding, segmentation based on pixel clustering, and segmentation based on graph-theoretic partitioning. These methods rely mainly on low-dimensional visual features of the image, such as color, texture, and edges: feature extraction algorithms extract visual information such as the edge and texture features of objects in the image, and the regions and objects are then segmented according to these low-level features. Commonly used image features include histogram-of-oriented-gradients (HOG) features, SIFT features, SURF features, local binary patterns (LBP), and Gabor features. [0003] The method...
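Of the low-level features listed above, the local binary pattern is the simplest to show concretely. The sketch below is a basic 3x3 LBP (each pixel encoded by thresholding its 8 neighbours against it), given only to illustrate the kind of hand-crafted feature the background describes; it is not part of the patented method.

```python
import numpy as np

def lbp(img):
    """Return the 8-bit local binary pattern code for each interior pixel
    of a 2-D image: bit k is set when the k-th neighbour >= the centre."""
    c = img[1:-1, 1:-1]  # centre pixels (borders have no full neighbourhood)
    # 8 neighbour offsets, clockwise from the top-left corner
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        nb = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        code |= (nb >= c).astype(np.uint8) << bit
    return code

img = np.array([[0, 0, 0],
                [0, 5, 0],
                [0, 9, 0]], dtype=np.float32)
print(lbp(img))  # [[32]] : only the bottom neighbour (bit 5) is >= the centre
```

Histograms of such codes over image patches serve as texture descriptors in the traditional segmentation pipelines the background section contrasts with the learned, multi-scale approach of the invention.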


Application Information

IPC (IPC8): G06K9/00; G06K9/34; G06K9/46; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/04; G06N3/08; G06F18/2415; G06F18/25
Inventors: 肖长诗, 陈芊芊, 文元桥, 周春辉, 陈华龙
Owner WUHAN UNIV OF TECH