Target identification method based on deep learning network and hot-line robot

The invention relates to deep learning networks and target recognition technology, applied in the field of target recognition for live working (hot-line) robots. It addresses the cumbersome operation steps, low fault tolerance, and slow operation of manual teleoperation, and achieves a high target recognition rate, good recognition performance, and fewer misrecognitions.

Inactive Publication Date: 2017-08-18
NANJING UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

However, this approach has the following disadvantages:
1. The operator must be highly technically proficient, which requires strict, professional training, consumes manpower and material resources, and keeps efficiency low.
2. Manual master-slave operation involves cumbersome steps and is slow.
3. The fault tolerance of the operating process is low, so mistakes may damage the robotic arm and even hold up the whole job.
Existing recognition methods are also highly data-dependent, and their recognition accuracy is often poor when the target lies against a complex background.




Embodiment Construction

[0036] It is easy to understand that, without departing from the essential spirit of the present invention, those of ordinary skill in the art can conceive of various embodiments of the deep-learning-network-based target recognition method for the live working robot according to the technical solution of the present invention. Therefore, the following specific embodiments and drawings are only exemplary descriptions of the technical solution of the present invention and should not be regarded as its entirety or as a restriction or limitation of that solution.

[0037] With reference to the accompanying drawings, a method for target recognition of a live working robot based on a deep learning network includes the following steps:

[0038] Step 1: Collect pictures of the job objects and establish a target database. The pictures are taken both from the Internet and on site.
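As a rough illustration of step 1, the following Python sketch loads the collected pictures into a labelled target database. It assumes the pictures have been sorted into one folder per job object; the folder names, the 32x32 image size, and the use of torchvision are assumptions made for illustration, not details given in the patent.

    # Illustrative sketch only: assumes the collected pictures were sorted into
    # one sub-folder per job object, e.g. target_db/insulator/, target_db/bolt/.
    # Folder names, the 32x32 size, and the use of torchvision are assumptions.
    from torchvision import datasets, transforms

    preprocess = transforms.Compose([
        transforms.Grayscale(),          # the network takes two-dimensional image data
        transforms.Resize((32, 32)),     # assumed input resolution
        transforms.ToTensor(),
    ])

    # Each sub-folder of target_db becomes one target class in the database.
    target_db = datasets.ImageFolder("target_db", transform=preprocess)
    print(len(target_db), "pictures in", len(target_db.classes), "classes")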

[0039] Step 2: Divide the target d...



Abstract

The invention provides a target recognition method for a hot-line (live working) robot based on a deep learning network. The weights and biases of each layer of a convolutional neural network are initialized; two-dimensional image data is input and convolved to obtain the C1-layer data; max pooling is applied to the C1-layer data to reduce the amount of data, giving the S2-layer data; the S2-layer data is convolved to obtain the C3-layer data; max pooling is applied to the C3-layer data to further reduce the amount of data, giving the S4-layer data; the S4-layer data is convolved to obtain the C5-layer data; the C5-layer data is fed as input to a fully connected H6 layer, and an output layer fully connected to the H6 layer outputs the learned target features; finally, the weights and biases are fine-tuned with the back-propagation algorithm to complete network training. The method achieves a high target recognition rate and is robust under complex backgrounds.
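The layer sequence in the abstract (C1-S2-C3-S4-C5-H6-output) matches a LeNet-style convolutional network. The sketch below, in PyTorch, is one possible reading of that description; the filter counts, kernel sizes, 32x32 grey-scale input, ReLU activations, and SGD settings are assumptions, since the abstract only fixes the layer order, the use of max pooling, and back-propagation fine-tuning.

    import torch
    import torch.nn as nn

    class TargetNet(nn.Module):
        """C1-S2-C3-S4-C5-H6-output network; layer sizes are assumed, not from the patent."""
        def __init__(self, num_classes: int = 4):
            super().__init__()
            self.c1 = nn.Conv2d(1, 6, kernel_size=5)     # C1: convolution of the 2-D input image
            self.s2 = nn.MaxPool2d(2)                    # S2: max pooling halves the feature maps
            self.c3 = nn.Conv2d(6, 16, kernel_size=5)    # C3: second convolution
            self.s4 = nn.MaxPool2d(2)                    # S4: second max pooling
            self.c5 = nn.Conv2d(16, 120, kernel_size=5)  # C5: convolution giving 1x1 feature maps
            self.h6 = nn.Linear(120, 84)                 # H6: fully connected layer fed by C5
            self.out = nn.Linear(84, num_classes)        # output layer fully connected to H6
            self.act = nn.ReLU()

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            x = self.s2(self.act(self.c1(x)))
            x = self.s4(self.act(self.c3(x)))
            x = self.act(self.c5(x)).flatten(1)
            return self.out(self.act(self.h6(x)))

    # Weights and biases are initialised when the module is built; fine-tuning them
    # with back-propagation is an ordinary supervised training step:
    model = TargetNet()
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

    images = torch.randn(8, 1, 32, 32)      # placeholder batch of 32x32 grey-scale pictures
    labels = torch.randint(0, 4, (8,))      # placeholder job-object labels
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()                          # back-propagate the recognition error
    optimizer.step()                         # adjust the weights and biases

With the assumed 32x32 input, C1 produces 28x28 maps, S2 14x14, C3 10x10, S4 5x5, and C5 1x1, so the 120 C5 outputs feed directly into the fully connected H6 layer, mirroring the data-reduction sequence described in the abstract.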

Description

Technical field
[0001] The invention relates to the field of target recognition for live working robots, and in particular to a target recognition method for a live working robot based on a deep learning network.
Background technique
[0002] Introducing live working robots into the power industry in place of manual live maintenance and repair can effectively avoid casualties during live work and can greatly improve the efficiency of the power industry. At present, live working robots mainly use master-slave teleoperation: the operator remotely controls the slave manipulator through the master manipulator, so the operator's personal safety is guaranteed to a certain extent. But this approach has the following shortcomings: 1. The operator must be highly technically proficient, which requires strict, professional training, consumes manpower and material resources, and keeps efficiency low. 2. The manual m...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): B25J9/16; B25J9/04; G06N3/08; G06K9/62
CPC: G06N3/084; B25J9/04; B25J9/1656; G06F18/214
Inventor: 郭毓姚伟郭健李光彦吴巍苏鹏飞吴禹均韩昊一黄颖汤冯炜林立斌
Owner: NANJING UNIV OF SCI & TECH