
Complex environment target segmentation method and device based on multi-module convolutional neural network

A convolutional neural network and object segmentation technology, applied in the field of object segmentation in complex environments. It addresses problems such as the difficulty of adaptive image correction, pervasive occlusion, and the limited applicability of existing segmentation technology, thereby improving segmentation robustness while ensuring accuracy and real-time performance.

Pending Publication Date: 2021-07-27
INST OF INTELLIGENT MFG GUANGDONG ACAD OF SCI

AI Technical Summary

Problems solved by technology

1) Variations in illumination. Although this problem can be alleviated to some extent by data augmentation, it is difficult to perform image correction effectively in an adaptive way.

2) Ubiquitous occlusion. Target objects are often occluded by other objects, and the resulting incomplete contour information and partial texture features challenge existing target segmentation methods: a single complete target may appear in the visual system as multiple independent fragments and be misrecognized as multiple targets. This reduces the accuracy of vision-based target instance segmentation and severely limits the application of segmentation technology in real scenes.



Examples


Embodiment 1

[0046] See Figure 1, which is a schematic flow chart of a complex environment target segmentation method based on a multi-module convolutional neural network in an embodiment of the present invention.

[0047] As shown in Figure 1, a complex environment target segmentation method based on a multi-module convolutional neural network includes:

[0048] S11: Collect target image data in complex scenes and perform data labeling to obtain the labeled target image data;

[0049] In a specific implementation of the present invention, collecting target image data in complex scenes and performing data labeling to obtain the labeled target image data includes: collecting target image data in different regions and in different scenes; and labeling that data with the image annotation tool Labelme to obtain the labeled target image data.
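Labelme stores each annotation as a JSON file of labeled polygons. As a minimal sketch of how such annotations can be turned into per-label binary masks for training (the function name and return format are assumptions, not part of the patent):

```python
import numpy as np
from PIL import Image, ImageDraw

def labelme_to_masks(annotation):
    """Convert a parsed Labelme JSON annotation (dict) into one binary
    mask per label by rasterizing each polygon onto an empty canvas."""
    h, w = annotation["imageHeight"], annotation["imageWidth"]
    masks = {}
    for shape in annotation.get("shapes", []):
        canvas = Image.new("L", (w, h), 0)
        points = [tuple(p) for p in shape["points"]]
        ImageDraw.Draw(canvas).polygon(points, outline=1, fill=1)
        mask = np.array(canvas, dtype=bool)
        # merge polygons that share a label into one mask
        masks.setdefault(shape["label"], np.zeros((h, w), dtype=bool))
        masks[shape["label"]] |= mask
    return masks
```

In practice the annotation dict would come from `json.load` on a Labelme output file.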

[0050] Specifically, in the embodiment of the p...

Embodiment 2

[0068] See Figure 2, which is a schematic diagram of the structural composition of the complex environment target segmentation device based on a multi-module convolutional neural network in an embodiment of the present invention.

[0069] As shown in Figure 2, a complex environment target segmentation device based on a multi-module convolutional neural network includes:

[0070] Labeling module 21: configured to collect target image data in complex scenes and perform data labeling to obtain labeled target image data;

[0071] In a specific implementation of the present invention, collecting target image data in complex scenes and performing data labeling to obtain the labeled target image data includes: collecting target image data in different regions and in different scenes; and labeling that data with the image annotation tool Labelme to obtain the labeled target...



Abstract

The invention discloses a complex environment target segmentation method and device based on a multi-module convolutional neural network. The method comprises the steps of: collecting target image data in a complex scene and performing data annotation to obtain annotated target image data; performing data enhancement on the annotated data and building a data set to obtain a target image data set; performing preliminary target detection on the images in the data set with a convolutional neural network model to obtain a preliminary detection result; and inputting the preliminary detection result into a segmented-region shape completion module, which corrects predictions for occluded targets through a puzzle algorithm to obtain the segmentation result. According to the embodiment of the invention, the method achieves precise segmentation of targets in a natural complex environment while guaranteeing segmentation accuracy and real-time performance.
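The abstract describes a two-stage inference flow: a CNN produces preliminary detections, and a shape completion module then corrects occluded targets. A minimal structural sketch of that flow, where `detector` and `shape_completer` are hypothetical callables standing in for the patent's modules:

```python
import numpy as np

def segment_complex_scene(image, detector, shape_completer):
    """Two-stage flow from the abstract: CNN-based preliminary detection
    of candidate target masks, followed by per-mask shape completion
    that corrects predictions for occluded targets."""
    preliminary_masks = detector(image)            # stage 1: preliminary detection
    return [shape_completer(m) for m in preliminary_masks]  # stage 2: occlusion correction
```

This only captures the data flow, not the patent's actual network architecture or puzzle algorithm.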

Description

technical field

[0001] The present invention relates to the technical field of computer vision, and in particular to a method and device for segmenting targets in complex environments based on a multi-module convolutional neural network.

Background technique

[0002] With the rapid development of computer and automatic control technology, target instance segmentation in the field of computer vision is widely used across industries. It performs pixel-level segmentation on the basis of target detection, which better meets the needs of people's daily production and life.

[0003] However, existing methods usually perform poorly in complex natural environments, mainly for the following two reasons: 1) Variations in illumination. In natural environments, the intensity and angle of illumination change constantly, which alters the data distribution of images and in turn affects feature extraction. Although this problem can be all...

Claims


Application Information

Patent Timeline
IPC(8): G06T7/11, G06T7/181, G06T5/00, G06N3/04, G06N3/08
CPC: G06T7/11, G06T7/181, G06N3/08, G06T2207/20081, G06T2207/20084, G06T2207/20132, G06T2207/10004, G06N3/045, G06T5/90
Inventor: 雷欢, 焦泽昱, 黄丹, 黄凯, 陈再励, 马敬奇, 王楠, 钟震宇
Owner INST OF INTELLIGENT MFG GUANGDONG ACAD OF SCI