
Data deep fusion image segmentation method for multispectral rescue robot

A rescue-robot and multi-spectral image technology, applied in the field of data deep fusion image segmentation for multi-spectral rescue robots. It addresses the poor scene understanding and low working efficiency of traditional semantic segmentation methods, and achieves shortened disaster-relief time, reduced labor costs, and markedly improved accuracy.

Active Publication Date: 2020-08-25
Jilin Senxiang Technology Co., Ltd. (吉林省森祥科技有限公司)

AI Technical Summary

Problems solved by technology

Traditional semantic segmentation methods understand scenes poorly and work inefficiently in complex environments characterized by unstructured layouts, diverse targets, irregular shapes, illumination changes, and object occlusion. At high-risk disaster sites such as fires, explosions, and earthquakes, accurate and efficient on-site inspection can reduce the workload and pressure on rescuers, but the accuracy of image semantic segmentation still needs further improvement.



Examples


Embodiment Construction

[0031] The data deep fusion image segmentation method for multi-spectral rescue robots proposed by the present invention comprises the following steps:

[0032] Step 1. Generate target segmentation training dataset:

[0033] (1) Information collection: the rescue robot uses its onboard camera to collect data about the external environment, and converts the collected geographic location and environmental status information into multi-spectral image data for storage;

[0034] (2) Manual labeling of the dataset: to train the neural network, the collected multispectral images are labeled manually, pixel by pixel; each label indicates the object category to which that pixel belongs. The object categories include standing human body, lying human body, common flammables, common explosives, radioactive devices, structures, falling rocks, and other common objects at disaster sites.
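The pixel-wise labeling described above can be sketched as an integer mask aligned with the image, one class index per pixel. The class-index mapping below is hypothetical (the patent does not disclose the actual encoding); only the category names come from the source.

```python
import numpy as np

# Hypothetical class-index mapping for the categories named in the patent;
# the actual encoding used by the authors is not disclosed.
CLASSES = {
    0: "background",
    1: "standing human body",
    2: "lying human body",
    3: "common flammables",
    4: "common explosives",
    5: "radioactive device",
    6: "structure",
    7: "falling rocks / other common object",
}

def make_label_mask(height: int, width: int) -> np.ndarray:
    """Create an empty per-pixel label mask (all background)."""
    return np.zeros((height, width), dtype=np.uint8)

# Example: annotate a rectangular region as "lying human body" (class 2).
mask = make_label_mask(64, 64)
mask[20:30, 10:40] = 2
print(sorted(np.unique(mask).tolist()))  # class indices present in the mask
```

Each training sample is then a (multispectral image, mask) pair, with the mask sharing the image's spatial dimensions.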

[0035] Step 2. Construct a U-shaped network that performs single-spectrum, single-target image segmentation.
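The patent does not disclose the architecture details of its U-shaped network, but the general pattern is an encoder-decoder with skip connections. A minimal sketch, assuming a single downsampling stage, single-channel (single-spectrum) input, and the eight categories listed in Step 1:

```python
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    """Minimal U-shaped network sketch: one downsampling and one upsampling
    stage with a skip connection. Depth, channel widths, and spectral-band
    handling in the patented model are not disclosed; these are placeholders."""
    def __init__(self, in_ch: int = 1, n_classes: int = 8):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU())
        self.down = nn.MaxPool2d(2)
        self.mid = nn.Sequential(nn.Conv2d(16, 32, 3, padding=1), nn.ReLU())
        self.up = nn.ConvTranspose2d(32, 16, 2, stride=2)
        # Skip connection: concatenate encoder features with upsampled features.
        self.dec = nn.Sequential(nn.Conv2d(32, 16, 3, padding=1), nn.ReLU())
        self.head = nn.Conv2d(16, n_classes, 1)  # per-pixel class logits

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        e = self.enc(x)
        m = self.mid(self.down(e))
        u = self.up(m)
        d = self.dec(torch.cat([e, u], dim=1))
        return self.head(d)

net = TinyUNet(in_ch=1, n_classes=8)
logits = net(torch.randn(1, 1, 64, 64))
print(tuple(logits.shape))  # (batch, classes, H, W) — per-pixel logits
```

One such network per spectral band would produce the single-spectrum segmentations that the later fusion stage combines.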


Abstract

The invention provides a data deep fusion image segmentation method for a multispectral rescue robot, and aims to further improve the precision of the rescue robot's image semantic segmentation, improve the accuracy of its troubleshooting and analysis at a disaster site, and enable autonomous detection without manual command and control. The method comprises the following steps: generating a target segmentation training dataset; constructing a U-shaped network for single-spectrum, single-target image segmentation; establishing an evidence theory-Bayesian two-stage decision fusion model for image semantic segmentation; and training to obtain a semantic segmentation model with multispectral data fusion. The evidence theory-Bayesian two-stage decision fusion model shows clear improvements in accuracy, recall, precision, and related metrics. The robot can autonomously inspect a disaster site in a complex disaster-relief environment, improving detection accuracy and efficiency, reducing labor costs, shortening disaster-relief time, and reducing casualties.
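The abstract names an evidence theory-Bayesian two-stage decision fusion model but does not publish its formulation. As an illustration of the evidence-theory half only, the sketch below combines two hypothetical per-pixel class beliefs (e.g. from a visible-band and an infrared-band segmenter) with Dempster's rule of combination, restricted to singleton hypotheses for simplicity:

```python
import numpy as np

def dempster_combine(m1, m2):
    """Combine two basic probability assignments over singleton classes
    with Dempster's rule. Mass on compound hypotheses is omitted for
    simplicity; the patent's actual fusion model is not disclosed."""
    m1, m2 = np.asarray(m1, float), np.asarray(m2, float)
    joint = m1 * m2                   # mass where both sources agree on a class
    conflict = 1.0 - joint.sum()      # mass assigned to conflicting class pairs
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return joint / (1.0 - conflict)   # renormalize by the non-conflicting mass

# Two hypothetical per-pixel beliefs over three classes.
vis = [0.6, 0.3, 0.1]   # e.g. visible-band segmenter
ir  = [0.5, 0.4, 0.1]   # e.g. infrared-band segmenter
fused = dempster_combine(vis, ir)
print(fused.round(3), int(fused.argmax()))
```

Note how agreement between the two sources reinforces the winning class: the fused belief in class 0 exceeds either input belief, which is the behavior that makes evidence combination attractive as a first fusion stage.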

Description

technical field [0001] The invention relates to an image segmentation method, in particular to a data deep fusion image segmentation method for multi-spectral rescue robots. Background technique [0002] Earthquakes, mudslides, tsunamis and other natural disasters occur frequently around the world today. Building structures at disaster sites collapse, drastically altering the terrain; the remaining spaces are small and unstable, which seriously threatens human life. How quickly and efficiently rescuers carry out their work therefore bears directly on the lives of trapped people. With the continuous advancement of science and technology, rescue robots have begun to enter dangerous and complex disaster sites to carry out rescue work, addressing the time constraints of traditional search and rescue and the safety risks faced by survivors and rescue personnel. Rescue robots are intelligent robots that ca...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/34, G06K9/62, G06N3/08, G06N3/04
CPC: G06N3/08, G06V10/267, G06N3/045, G06F18/214, G06F18/25
Inventors: Zhao Anni (赵安妮), Han Guidong (韩贵东), Ma Zhigang (马志刚), Wang Xu (王旭)
Owner: Jilin Senxiang Technology Co., Ltd. (吉林省森祥科技有限公司)