
Robot weak light environment grabbing detection method based on multi-task sharing network

A multi-task shared-network detection technology, applied in the field of robotic grasp detection in low-light environments, which solves the problem of low detection accuracy and achieves the effects of reducing enterprise operating costs, keeping costs controllable, and expanding the range of feasible working hours.

Active Publication Date: 2021-06-11
SHANXI UNIV

AI Technical Summary

Problems solved by technology

[0004] Aiming at the problem of low detection accuracy in low-contrast, low-light environments, the present invention provides a grasp detection method for robots in low-light environments based on a multi-task shared network.



Examples


Embodiment 1

[0048] Figure 1 is a flow chart of the present invention. The grasp detection method for robots in low-light environments based on a multi-task shared network comprises the following steps:

[0049] Step 1: Collect and construct the corresponding data set d. The data set d consists of 4n "low-light image / normal-light image / grasp annotation" samples, where I_low^(i,j) denotes the j-th low-light image captured in scene i, I_normal^i denotes the normal-illumination image of scene i, b_i denotes the grasp-box annotation parameters for scene i, n denotes the number of scenes in the data set, and j indexes the low-light images taken in each scene. To ensure strict matching between the images taken in the low-light and normal-lighting environments, image acquisition is carried out scene by scene until the complete data set d is obtained (a minimal code sketch of the resulting data-set structure is given after Step S1.1).

[0050] Step S1.1: Build a darkroom with no light inside, and install dimmable LED bulbs at the bottom ...
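As a concrete illustration of the data set d constructed in Step 1, the following is a minimal sketch assuming PyTorch and a hypothetical on-disk layout (one folder per scene holding the low-light exposures, one normal-light reference image, and a grasp-annotation text file). The patent does not specify a storage format or a grasp-box parameterization, so the paths, file names, and the 5-parameter (x, y, w, h, theta) rectangle used here are all assumptions.

```python
import os
from glob import glob

import torch
from torch.utils.data import Dataset
from torchvision.io import read_image


class LowLightGraspDataset(Dataset):
    """Pairs each low-light image I_low^(i,j) with its scene's
    normal-light reference I_normal^i and grasp annotation b_i.
    Directory layout is hypothetical: scene_*/low_*.png,
    scene_*/normal.png, scene_*/grasps.txt."""

    def __init__(self, root):
        self.samples = []
        for scene in sorted(glob(os.path.join(root, "scene_*"))):
            normal = os.path.join(scene, "normal.png")                    # I_normal^i
            boxes = self._load_boxes(os.path.join(scene, "grasps.txt"))   # b_i
            for low in sorted(glob(os.path.join(scene, "low_*.png"))):    # I_low^(i,j)
                self.samples.append((low, normal, boxes))

    @staticmethod
    def _load_boxes(path):
        # One grasp rectangle per line: x y w h theta (assumed format).
        with open(path) as f:
            rows = [[float(v) for v in line.split()] for line in f if line.strip()]
        return torch.tensor(rows, dtype=torch.float32)

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        low, normal, boxes = self.samples[idx]
        return (read_image(low).float() / 255.0,
                read_image(normal).float() / 255.0,
                boxes)
```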



Abstract

The invention discloses a grasp detection method for robots in low-light environments based on a multi-task shared network, and belongs to the technical field of computer vision and intelligent robots. During image acquisition, the method ensures strict matching between images shot in the low-light and normal-light environments and constructs a corresponding data set d. Darknet-53 is then adopted as the backbone network of a low-light grasp detection model to extract multi-scale features with strong expressive power, and a parallel cascaded grasp detection module and image enhancement module realize the grasp detection and low-light image enhancement tasks, respectively. Training samples are randomly selected from the data set d and predicted with the low-light grasp detection model; when the change of the loss value within the iteration number iter is smaller than a threshold t, the grasp detection model G has converged, i.e., training of model G is complete. Finally, an image Ilow shot in a low-light environment is input into the trained model G to obtain the predicted grasp-box parameters and an enhanced image, completing the grasp detection task in the low-light environment.
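To make the described architecture more concrete, here is a minimal PyTorch sketch of the multi-task idea: a shared convolutional backbone (a shallow stand-in for the much deeper Darknet-53) feeding two parallel heads, one regressing grasp-box parameters and one reconstructing an enhanced image, plus the convergence rule from the abstract (stop when the loss change within an iteration window falls below threshold t). The layer widths, the 5-parameter grasp output, the L1 losses, and the one-grasp-per-sample simplification are all assumptions for illustration; only the overall structure (shared backbone with parallel grasp-detection and enhancement branches) follows the abstract.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def conv_block(c_in, c_out, stride=1):
    # Conv-BN-LeakyReLU: the basic unit of Darknet-style backbones.
    return nn.Sequential(
        nn.Conv2d(c_in, c_out, 3, stride=stride, padding=1, bias=False),
        nn.BatchNorm2d(c_out),
        nn.LeakyReLU(0.1, inplace=True),
    )


class MultiTaskGraspNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Shared backbone (shallow stand-in for Darknet-53), overall stride 8.
        self.backbone = nn.Sequential(
            conv_block(3, 32),
            conv_block(32, 64, stride=2),
            conv_block(64, 128, stride=2),
            conv_block(128, 256, stride=2),
        )
        # Grasp-detection head: regresses one (x, y, w, h, theta) grasp
        # box per image (assumed parameterization).
        self.grasp_head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(256, 5)
        )
        # Enhancement head: maps the shared features back to an RGB image.
        self.enhance_head = nn.Sequential(
            nn.Upsample(scale_factor=8, mode="bilinear", align_corners=False),
            nn.Conv2d(256, 3, 3, padding=1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        feats = self.backbone(x)
        return self.grasp_head(feats), self.enhance_head(feats)


def train(model, loader, iter_window=100, t=1e-4, lr=1e-3):
    # Convergence rule from the abstract: stop once the loss change
    # within `iter_window` iterations is smaller than threshold t.
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    prev_loss, step = None, 0
    while True:
        for low, normal, boxes in loader:  # assumes one grasp per sample
            grasp_pred, enhanced = model(low)
            loss = (F.l1_loss(grasp_pred, boxes)    # grasp regression
                    + F.l1_loss(enhanced, normal))  # enhancement vs. normal light
            opt.zero_grad()
            loss.backward()
            opt.step()
            step += 1
            if step % iter_window == 0:
                if prev_loss is not None and abs(prev_loss - loss.item()) < t:
                    return model  # converged
                prev_loss = loss.item()
```

At inference time, an image Ilow shot in low light is passed through `model(Ilow)` once to obtain both outputs, the predicted grasp-box parameters and the enhanced image, matching the last step described in the abstract.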

Description

Technical field

[0001] The invention belongs to the technical field of computer vision and intelligent robots, and in particular relates to a grasp detection method for robots in low-light environments based on a multi-task shared network.

Background technique

[0002] With the continuous development of artificial intelligence, robot control, and perception technology, robots equipped with multi-degree-of-freedom robotic arms can autonomously grasp different target objects and carry out advanced human-computer interaction, playing an increasingly important role in a growing range of fields. A typical robot grasping process generally includes four steps: target localization, pose estimation, grasp detection, and motion planning. Among them, grasp detection determines the graspable parts of the object, and therefore plays a decisive role in the stability and accuracy of the robot's grasping operation. At present, the robot gra...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/32; G06N3/04; B25J9/16
CPC: B25J9/1697; G06V20/10; G06V10/25; G06N3/045
Inventors: 陈路, 钱宇华, 吴鹏, 王克琪
Owner: SHANXI UNIV