
Unmanned ship environment intelligent sensing method based on deep learning

A technology combining intelligent perception and deep learning, applied in the field of unmanned ship environment perception. It addresses the problem that existing methods cannot achieve feasible channel segmentation, and achieves the effect of taking both real-time performance and accuracy into account.

Pending Publication Date: 2020-11-06
海之韵(苏州)科技有限公司

AI Technical Summary

Problems solved by technology

[0003] At present, research on target recognition technology for unmanned ships continues. For example, Chinese patent CN107609601A discloses a ship target recognition method based on a multi-layer convolutional neural network. Although that method achieves high target recognition accuracy, it can only identify ship targets and cannot achieve feasible channel segmentation in the direction of navigation.



Embodiment Construction

[0039] The following clearly and completely describes the technical solutions in the embodiments of the present invention with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.

[0040] A deep learning-based intelligent perception method for the unmanned ship environment, the flow of which is shown in Figure 1, comprises:

[0041] Step 1: Obtain a training image data set and a test image data set;

[0042] Step 2: Construct the unmanned ship environment perception model;

[0043] Step 3: Use the training image data set to train the unmanned ship environment perception model;

[0044] Step 4: Use the test image data set to test the precision of the unmanned ship environment perception model; if the preset precision is reached, execute step 5, otherwise return to step 3 ...
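The steps above can be sketched as a simple train-until-precision loop. This is only an illustrative sketch: the toy point data, the linear stand-in model, and the 0.95 precision threshold are assumptions for demonstration, not the patent's actual network, image data set, or preset precision.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "data set": 2-D points labelled by which side of a line they fall on.
def make_split(n):
    x = rng.normal(size=(n, 2))
    y = (x[:, 0] + x[:, 1] > 0).astype(int)
    return x, y

train_x, train_y = make_split(200)   # step 1: training data set (stand-in)
test_x, test_y = make_split(100)     # step 1: test data set (stand-in)

w = np.zeros(2)                      # step 2: "model" is a linear classifier here
PRESET_PRECISION = 0.95              # assumed threshold; the patent leaves it open

for epoch in range(100):
    # step 3: one perceptron-style training pass over the training set
    for xi, yi in zip(train_x, train_y):
        pred = int(w @ xi > 0)
        w += (yi - pred) * xi
    # step 4: test precision; stop as soon as the preset precision is reached
    acc = np.mean((test_x @ w > 0).astype(int) == test_y)
    if acc >= PRESET_PRECISION:
        break                        # step 5 (deployment) would follow here

print(f"test accuracy {acc:.2f} after {epoch + 1} epoch(s)")
```

The point of the sketch is the control flow of steps 3 and 4: training repeats until the model passes the precision test, and only then does real-time inference (step 5) begin.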


Abstract

The invention relates to an unmanned ship environment intelligent sensing method based on deep learning. The method comprises the steps of: 1, acquiring a training image data set and a test image data set; 2, constructing an unmanned ship environment perception model; 3, training the unmanned ship environment perception model with the training image data set; 4, testing the precision of the unmanned ship environment perception model with the test image data set, judging whether the model reaches a preset precision, and if so, executing step 5, otherwise returning to step 3; and 5, acquiring a real-time image within the sight range of the unmanned surface vehicle and inputting it into the unmanned ship environment perception model to perform real-time target identification, positioning, and feasible forward-direction segmentation of the surrounding environment. Compared with the prior art, the method achieves target recognition and feasible navigation channel segmentation at the same time, places no constraint on the size of the input image, and takes both real-time performance and accuracy into account.
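The abstract notes that the input image size is not constrained. One common way to obtain that property is a fully convolutional design, in which the same kernel slides over any H x W input and produces a correspondingly sized per-pixel map; this is an assumption for illustration, as the text shown does not name the patent's architecture. A minimal NumPy sketch:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2-D convolution; the output size tracks the input size."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

kernel = np.ones((3, 3)) / 9.0          # one fixed 3x3 averaging filter

small = np.random.rand(32, 32)          # a low-resolution frame
large = np.random.rand(128, 96)         # a different-sized frame, same kernel

print(conv2d(small, kernel).shape)      # (30, 30)
print(conv2d(large, kernel).shape)      # (126, 94)
```

Because the kernel carries all the learned parameters, no fixed-size fully connected layer forces a particular input resolution, which is what lets a segmentation model accept arbitrarily sized images.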

Description

Technical field

[0001] The invention relates to the technical field of unmanned ship environment perception, and in particular to a deep learning-based method for intelligent perception of the unmanned ship environment.

Background technique

[0002] As a disruptive technology, maritime unmanned systems will occupy an important position in future society and are a hot spot in current industry and academia. Maritime intelligent unmanned systems have become an inevitable trend of industrial development and will redefine the rules of the ship equipment industry. In the traditional shipbuilding industry, manual remote control accounts for more than 70% of operation, requiring real-time manual control based on sensor information. In addition, owing to the lag of ship control and the complexity of sea conditions, considerable sailing experience is often needed to be competent as a captain. In some dynamic positioning scenarios that require precise ...

Claims


Application Information

IPC(8): G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/08, G06V2201/07, G06N3/045, G06F18/241, G06F18/214
Inventors: 刘笑成, 邓清馨, 孙志坚, 杨子恒, 胡智焕, 张卫东
Owner 海之韵(苏州)科技有限公司