
Target detection and visual positioning method based on YOLO series

A technology for visual positioning and target detection, applied in neural learning methods, character and pattern recognition, image data processing, etc., addressing the low applicability of existing methods while achieving a low error rate and a low missed-detection rate.

Pending Publication Date: 2021-06-08
SOUTH CHINA UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0002] In recent years, as deep-learning technology has matured, target detection models have been updated at an accelerating pace. Existing target detection models fall into two categories. One-stage detection algorithms, such as SSD and YOLO, do not require a region-proposal stage: they directly generate the category probability and position coordinates of the object, so the final detection result is obtained after a single pass; they are characterized by faster detection speed. Two-stage detection algorithms, such as Faster R-CNN and Fast R-CNN, divide the detection problem into two stages, first generating candidate regions and then classifying them; they are characterized by low error and missed-detection rates, but their speed is relatively slow.
[0003] Existing deep-learning-based target detection and visual positioning systems, such as the Chinese patent application "Target Detection and Positioning Method Based on YOLOv3 and OpenCV" (CN111563458A), can use only the YOLOv3 algorithm of the YOLO series for target detection, so their applicability is low.




Embodiment Construction

[0035] The present invention will be further described below in conjunction with the accompanying drawings.

[0036] As shown in Figure 1, the target detection and visual localization method based on the YOLO series includes the following steps:

[0037] (1) Collect RGB color images of the target to be detected, and build a self-made image set of the target;

[0038] Specifically, the RGB color images in step (1) are collected by a D435i depth camera fixed directly above the target to be detected; the D435i depth camera has an IMU, a binocular (stereo) camera, and an infrared emitter module, and is operated by configuring the ROS environment.
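The capture in step (1) can be sketched with the Intel RealSense SDK's Python bindings (pyrealsense2), which support the D435i. This is a minimal illustration, not the patent's implementation; the output directory, file-naming scheme, and frame count are assumptions, and the camera-facing imports are kept inside the function so the helpers can be used without a device attached.

```python
# Sketch of step (1): capturing RGB color images from a D435i fixed above the
# target, to build the self-made image set. Hypothetical names throughout.
import os


def frame_filename(out_dir: str, index: int) -> str:
    """Consistent file names for the self-made image set (illustrative scheme)."""
    return os.path.join(out_dir, f"target_{index:04d}.png")


def capture_image_set(out_dir: str, count: int = 100) -> None:
    """Grab `count` RGB frames and save them as PNG files."""
    import cv2                  # requires opencv-python
    import numpy as np
    import pyrealsense2 as rs   # requires the Intel RealSense SDK

    os.makedirs(out_dir, exist_ok=True)
    pipeline = rs.pipeline()
    config = rs.config()
    # Enable the color stream; 640x480 @ 30 fps is an assumed configuration.
    config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
    pipeline.start(config)
    try:
        for i in range(count):
            frames = pipeline.wait_for_frames()
            color = frames.get_color_frame()
            if not color:
                continue
            cv2.imwrite(frame_filename(out_dir, i), np.asanyarray(color.get_data()))
    finally:
        pipeline.stop()


# Usage (requires a connected D435i):
#   capture_image_set("image_set", count=100)
```

The patent states the camera is used through a ROS environment; an equivalent pipeline could subscribe to the RealSense ROS node's color topic instead of calling the SDK directly.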

[0039] (2) Annotate the image set, perform data processing, and define training, testing, and validation samples;

[0040] Specifically, the tool for labeling images in step (2) is LabelImg, which marks the coordinates and category of the target to be detected with a rectangular box and outputs annotations in VOC format; each image to be d...
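Step (2) can be illustrated with a short sketch that reads a LabelImg-style VOC annotation and partitions the image set into training/validation/test samples. The 8:1:1 split ratio, seed, and helper names are illustrative assumptions; the patent text does not specify them.

```python
# Sketch of step (2): parsing a VOC-format annotation (as produced by LabelImg)
# and splitting the self-made image set into train/val/test samples.
import random
import xml.etree.ElementTree as ET
from typing import List, Tuple


def parse_voc(xml_text: str) -> List[Tuple[str, int, int, int, int]]:
    """Return (class_name, xmin, ymin, xmax, ymax) for each labeled box."""
    root = ET.fromstring(xml_text)
    boxes = []
    for obj in root.iter("object"):
        b = obj.find("bndbox")
        boxes.append((obj.findtext("name"),
                      int(b.findtext("xmin")), int(b.findtext("ymin")),
                      int(b.findtext("xmax")), int(b.findtext("ymax"))))
    return boxes


def split_dataset(images, ratios=(0.8, 0.1, 0.1), seed=0):
    """Shuffle image names and split them into train/val/test lists."""
    images = list(images)
    random.Random(seed).shuffle(images)
    n_train = int(len(images) * ratios[0])
    n_val = int(len(images) * ratios[1])
    return (images[:n_train],
            images[n_train:n_train + n_val],
            images[n_train + n_val:])


# A minimal VOC annotation with one box, as LabelImg would write it.
example_xml = """<annotation>
  <object><name>target</name>
    <bndbox><xmin>48</xmin><ymin>60</ymin><xmax>210</xmax><ymax>230</ymax></bndbox>
  </object>
</annotation>"""
print(parse_voc(example_xml))           # [('target', 48, 60, 210, 230)]
train, val, test = split_dataset([f"img_{i}.png" for i in range(10)])
print(len(train), len(val), len(test))  # 8 1 1
```

For darknet training, these VOC boxes would subsequently be converted to YOLO's normalized center/width/height text format, but that conversion is not detailed in the visible text.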



Abstract

The invention discloses a target detection and visual positioning method based on the YOLO series. The method comprises the steps of: first, obtaining an RGB color image and a depth image of the target to be captured; inputting the acquired RGB color image into a YOLO-series target detection model built on the darknet framework to obtain coordinate information, category, and confidence; and combining the coordinate information with the depth image information to solve for the spatial three-dimensional coordinates of the target to be detected. With the positioning method based on the target detection algorithm provided by the invention, two-dimensional positioning can be carried out on the RGB color image acquired by the depth camera using a YOLO-series target detection algorithm deployed on the darknet framework, and three-dimensional positioning is realized by combining the depth information acquired at the corresponding position of the camera. Compared with other target detection algorithms, YOLO-series algorithms are faster but less precise; the method ensures target detection precision while increasing detection speed.
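The final step of the abstract, combining the detected 2D coordinates with the depth image to solve the target's spatial 3D coordinates, can be sketched with the standard pinhole deprojection model. The intrinsics (fx, fy, cx, cy) below are illustrative values, not the D435i's calibrated parameters, and sampling depth at the box center is an assumption about how the two are combined.

```python
# Sketch of the 2D -> 3D step: take the YOLO detection box center (u, v), read
# the aligned depth Z at that pixel, and deproject with the pinhole model:
#   X = (u - cx) * Z / fx,   Y = (v - cy) * Z / fy,   in camera coordinates.

def deproject(u: float, v: float, depth_m: float,
              fx: float, fy: float, cx: float, cy: float):
    """Pinhole deprojection of pixel (u, v) at depth Z (meters)."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)


def box_center(xmin: float, ymin: float, xmax: float, ymax: float):
    """Center pixel of a detection box, where the depth value is sampled."""
    return ((xmin + xmax) / 2.0, (ymin + ymax) / 2.0)


# Example: a detection centered at the principal point lies on the optical axis.
u, v = box_center(300, 220, 340, 260)   # -> (320.0, 240.0)
point = deproject(u, v, depth_m=0.5, fx=600.0, fy=600.0, cx=320.0, cy=240.0)
print(point)                            # (0.0, 0.0, 0.5)
```

In practice the RealSense SDK exposes an equivalent helper (`rs2_deproject_pixel_to_point`) that uses the camera's factory-calibrated intrinsics, which avoids hand-picking these parameters.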

Description

Technical Field

[0001] The invention belongs to the fields of machine vision, visual positioning, target detection, and deep learning, and in particular relates to a method for target detection and visual positioning based on the YOLO series.


Application Information

IPC (IPC8): G06T7/73; G06K9/62; G06N3/04; G06N3/08
CPC: G06T7/73; G06N3/04; G06N3/08; G06T2207/10024; G06T2207/10028; G06V2201/07; G06F18/2415; G06F18/214
Inventors: 曾锦秀, 魏武
Owner SOUTH CHINA UNIV OF TECH