
Grabbing robot based on target and scene character matching and grabbing method and system

An intelligent-robot technology that addresses the problems of insufficient identification and grasping accuracy, which impair the robot's grasping judgment, and achieves precise grasping control.

Pending Publication Date: 2022-05-13
SHANDONG UNIV

AI Technical Summary

Problems solved by technology

However, when the grasping scene contains many objects whose appearance, color, and other attributes are nearly identical, or which belong to the same category, the above detection algorithms cannot distinguish them in detail. This directly affects the robot's grasping judgment and results in insufficient grasping accuracy.



Examples


Embodiment 1

[0038] The present invention proposes using the text information on the object to be grasped, fusing the outputs of the target detection algorithm with those of the text detection and recognition algorithm, to build an accurate detection system for a specific grasping target, thereby achieving precise identification and localisation of the target object. At the same time, lightweight models are adopted to keep the system real-time, which eases deployment of the grasping task on the robot controller and solves the problem that current grasp-detection algorithms cannot distinguish similar objects in detail.

[0039] As shown in Figure 1, this embodiment provides a grasping robot based on target and scene text matching, comprising a depth camera, a chassis, a mechanical arm, and a controller;

[0040] The controller includes a preliminary detection module for the target to be grasped and a text detection...

Embodiment 2

[0112] This embodiment provides a grasping method based on target and scene text matching, including the following steps:

[0113] Step 1: Obtain the image of the target to be grasped;

[0114] Step 2: From the image of the target to be grasped and the target detection model, use a CNN for feature extraction and regression to obtain the classification result and bounding box of the target to be grasped;

[0115] Step 3: For targets with the same classification result, use the text detection and recognition model to extract and recognise the text within each detection box; when the text recognition result is successfully matched with the specific target, obtain the initial three-dimensional coordinates;

[0116] Step 4: Use a target tracking algorithm to localise the detection box of the specific grasping target, obtain the final grasping coordinates, and control the chassis motion and the robotic arm to...
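Assuming Python and stubbing out the detector, OCR, and camera intrinsics (the class, function names, and intrinsic values below are illustrative placeholders, not taken from the patent), steps 2–3 of the method can be sketched as:

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Detection:
    """One output of the target detection model (step 2)."""
    label: str                      # classification result
    box: Tuple[int, int, int, int]  # bounding box (x1, y1, x2, y2)
    text: str                       # text recognised inside the box (step 3, stubbed)

def select_grasp_target(detections: List[Detection],
                        wanted_label: str,
                        wanted_text: str) -> Optional[Detection]:
    """Step 3: among detections sharing the requested class, pick the one
    whose recognised text matches the requested specific target."""
    for d in detections:
        if d.label == wanted_label and wanted_text in d.text:
            return d
    return None  # no match: the specific target is not in view

def box_center_to_3d(box: Tuple[int, int, int, int],
                     depth_mm: float,
                     fx: float = 600.0, fy: float = 600.0,
                     cx: float = 320.0, cy: float = 240.0) -> Tuple[float, float, float]:
    """Back-project the box centre through a pinhole camera model using the
    depth reading; the intrinsics here are assumed, not from the patent."""
    u = (box[0] + box[2]) / 2.0
    v = (box[1] + box[3]) / 2.0
    z = depth_mm / 1000.0  # depth camera reading, converted to metres
    return ((u - cx) * z / fx, (v - cy) * z / fy, z)
```

In a real deployment the `text` field would come from the text detection and recognition model and the depth value from the depth camera; step 4 would then refine the selected box with a tracker before handing the coordinates to the chassis and arm controllers.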

Embodiment 3

[0118] This embodiment provides a grasping system based on target and scene text matching, including a robot and a terminal, where the robot receives the grasping instructions issued by the terminal;

[0119] The robot includes a preliminary detection module for the target to be grasped, a text detection and recognition module, and a target grasping module;

[0120] The preliminary detection module is used to obtain the image of the target to be grasped and, from that image and the target detection model, use a CNN for feature extraction and regression to obtain the classification result and bounding box of the target to be grasped;

[0121] The text detection and recognition module is used, for targets with the same classification result, to extract and recognise the text in each detection box with the text detection and recognition model, and obtain the initial three-dimensional...
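Because text recognised on real objects is often noisy, an exact string comparison in the matching step would be brittle. A minimal sketch of one plausible matching rule, using a similarity ratio with an assumed threshold (the patent does not specify how recognised text is matched to the target), might look like:

```python
from difflib import SequenceMatcher

def match_text_to_target(recognised: str, target: str,
                         threshold: float = 0.6) -> bool:
    """Return True when the recognised text is close enough to the target
    name; the 0.6 threshold is an assumed tuning value, not from the patent."""
    ratio = SequenceMatcher(None, recognised.lower(), target.lower()).ratio()
    return ratio >= threshold
```

For example, `match_text_to_target("C0LA", "cola")` tolerates a one-character OCR error, while clearly different text such as "milk" is rejected.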



Abstract

The invention belongs to the field of intelligent robots and provides a grabbing robot based on matching targets with scene text, together with a grabbing method and system. The method comprises the following steps: from the image of the target to be grabbed obtained by a camera and a target detection model, perform feature extraction and regression with a CNN to obtain the classification result and bounding box of the target to be grabbed; for targets with the same classification result, use a text detection and recognition model to extract and recognise the text in each detection box, and obtain initial three-dimensional coordinates once the text recognition result is successfully matched with the specific target; and use a target tracking algorithm to localise the detection box of the specific target, obtain the final grabbing coordinates, and control the chassis motion and the mechanical arm according to those coordinates to grab the specific target.

Description

Technical field

[0001] The invention belongs to the field of intelligent robots and in particular relates to a grasping robot based on matching targets with scene text, and to a grasping method and system thereof.

Background

[0002] The statements in this section merely provide background information related to the present invention and do not necessarily constitute prior art.

[0003] In the prior art, most robot grasp-detection algorithms either detect and grasp a single object directly, or use complex neural networks to distinguish multiple objects by segmentation, classification, and labeling. However, when the grasping scene contains many objects whose appearance, color, and other attributes are nearly identical, or which belong to the same category, these algorithms cannot distinguish the objects in detail, which directly affects the robot's grasping judgment and results in insufficient grasping accuracy. ...

Claims


Application Information

IPC(8): G06V30/148, G06V20/62, G06N3/04, B25J9/16
CPC: B25J9/16, G06N3/045
Inventors: 许庆阳, 刘志超, 丁凯旋, 宋勇, 李贻斌, 张承进, 袁宪锋, 庞豹
Owner: SHANDONG UNIV