Hierarchical feature fusion method for multi-target detection of mobile robot

A mobile-robot target detection technology applied in the field of environmental perception of mobile robots. It addresses the problems of insufficient feature extraction and poor detection of objects of different scales, and achieves improved detection capability and efficiency.

Pending Publication Date: 2021-02-05
BEIJING UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0004] Aiming at the problems that the existing technology extracts insufficiently rich features because it uses only a single-scale convolution kernel, and that its ability to detect objects of different scales in the same scene is low, the present invention uses dilated convolutions with different dilation rates to simulate receptive fields of different sizes and thereby extract features of different scales; at the same time, a hierarchical feature fusion method is used to fuse the features extracted by the different branches, so that every channel of the fused feature map contains feature information of multiple scales.
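As an illustration of the dilated-convolution pyramid this paragraph describes, the sketch below builds three parallel branches with different dilation rates in PyTorch. The specific dilation rates (1, 2, 3), channel counts, and module names are assumptions for illustration; the patent excerpt does not fix these values.

```python
import torch
import torch.nn as nn

class DilatedPyramid(nn.Module):
    """Three parallel dilated-convolution branches, each simulating a
    different receptive-field size.  Dilation rates and channel counts
    are illustrative assumptions, not values taken from the patent."""
    def __init__(self, in_channels=1024, branch_channels=256, rates=(1, 2, 3)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Sequential(
                # padding = dilation keeps the spatial size unchanged
                nn.Conv2d(in_channels, branch_channels, kernel_size=3,
                          padding=r, dilation=r),
                nn.ReLU(inplace=True),
            )
            for r in rates
        ])

    def forward(self, x):
        # Each branch sees the same input through a different receptive
        # field and returns one scale-specific feature map.
        return [branch(x) for branch in self.branches]
```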



Examples


Example Embodiment

[0029]The embodiments of the present invention will be described in further detail below with reference to the accompanying drawings.

[0030] As shown in Figure 1, the present invention is a hierarchical feature fusion method for multi-target detection of mobile robots, which includes the following steps:

[0031] Step 1: Obtain the initial feature map. The images in the data set are input into the pre-trained, improved VGG-16. The original VGG-16 network structure is shown in Figure 2(a): it consists of 13 convolutional layers and 3 fully connected layers. The convolutional layers are Conv1_1, Conv1_2, Conv2_1, Conv2_2, Conv3_1, Conv3_2, Conv3_3, Conv4_1, Conv4_2, Conv4_3, Conv5_1, Conv5_2, Conv5_3; the fully connected layers are FC6, FC7, FC8. The improved VGG-16 network structure is shown in Figure 2(b): the FC6 and FC7 fully connected layers of the VGG-16 network are changed into convolutional layers. The feature map T1 initially obtained in step 1 is the output of the convolution...
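A minimal PyTorch sketch of the improved backbone described in step 1 is given below, assuming the two replacement layers use an SSD-style 3x3/1024 convolution for FC6 and a 1x1/1024 convolution for FC7; the excerpt only states that FC6 and FC7 become convolutional, so these kernel sizes and channel counts are assumptions.

```python
import torch.nn as nn
from torchvision.models import vgg16

class ImprovedVGG16(nn.Module):
    """VGG-16 backbone whose FC6 and FC7 fully connected layers are
    replaced by convolutional layers, as step 1 describes.  The kernel
    sizes and channel counts of the replacement layers are illustrative
    assumptions."""
    def __init__(self):
        super().__init__()
        # Conv1_1 ... Conv5_3 and the pooling layers of the original VGG-16;
        # in practice the pre-trained weights would be loaded here.
        self.features = vgg16(weights=None).features
        self.conv6 = nn.Conv2d(512, 1024, kernel_size=3, padding=1)  # replaces FC6
        self.conv7 = nn.Conv2d(1024, 1024, kernel_size=1)            # replaces FC7
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        x = self.features(x)          # output after Conv5_3 and pooling
        x = self.relu(self.conv6(x))
        x = self.relu(self.conv7(x))  # initial feature map T1
        return x
```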


Abstract

The invention relates to the field of environmental perception of mobile robots, in particular to a hierarchical feature fusion method for multi-target detection of a mobile robot, and aims to improve the detection capability of a target detection algorithm for targets of different scales, so as to improve the environmental perception capability of an intelligent robot. The method includes: inputting the images in the data set into a pre-trained, improved VGG-16 to preliminarily obtain a feature map; inputting the preliminarily acquired feature map into a dilated-convolution pyramid structure, which comprises three dilated-convolution branches with different dilation rates and is used to match the targets of different scales acquired by the visual sensor while the robot moves; fusing the feature maps acquired by the different branches in the layered superposition mode provided by the invention, so that all channels of the fused feature map contain feature information of different scales; convolving the fused feature map step by step to obtain feature maps of different sizes; and finally obtaining the category and the bounding box of the object to be detected.
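The core of the abstract is the fusion of the three branch outputs by "layered superposition" so that every channel of the fused map carries multi-scale information. The exact superposition rule is not spelled out in this excerpt; the sketch below shows one plausible reading (progressive accumulation of the branch outputs followed by channel concatenation), and the usage lines referring to the earlier sketches are likewise illustrative assumptions.

```python
import torch

def layered_superposition(branch_feats):
    """Fuse the dilated-branch feature maps so that every channel of the
    result mixes information from several scales.  The progressive
    accumulation used here (b1, b1+b2, b1+b2+b3, then channel-wise
    concatenation) is an assumed reading of 'layered superposition';
    the excerpt does not specify the exact rule."""
    fused, acc = [], None
    for feat in branch_feats:          # branches ordered by dilation rate
        acc = feat if acc is None else acc + feat
        fused.append(acc)
    return torch.cat(fused, dim=1)     # every output channel is multi-scale

# Hypothetical end-to-end usage with the sketches above:
#   t1    = ImprovedVGG16()(images)                 # initial feature map
#   fused = layered_superposition(DilatedPyramid()(t1))
# 'fused' would then be convolved step by step into feature maps of
# different sizes from which classes and bounding boxes are predicted.
```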

Description

technical field

[0001] The invention relates to the field of environment perception of mobile robots, in particular to a hierarchical feature fusion method for multi-target detection of mobile robots.

Background technique

[0002] With the continuous expansion of the application range of intelligent robots in the home environment, people have put forward higher and higher requirements for a robot's environmental perception ability. When a robot searches for objects, objects of different scales often appear in its visual sensor, and existing target detection algorithms cannot detect these objects well. It is therefore necessary to improve the ability of the target detection algorithm to detect objects of different scales, thereby improving the environment perception ability of intelligent robots.

[0003] In order to enhance the detection effect of the network on targets of different scales, many scholars have improved the two-stage targ...


Application Information

IPC(8): G06K 9/62; G06N 3/04; G06N 3/08
CPC: G06N 3/08; G06V 2201/07; G06N 3/045; G06F 18/253
Inventors: 杨金福, 袁帅, 李明爱, 王康
Owner: BEIJING UNIV OF TECH