
A man-machine cooperation oriented real-time posture detection method for hand-held objects

A real-time detection and object technology, applied in image data processing, instrumentation, computing, etc., can solve problems such as single application scenarios, a large amount of time and data

Active Publication Date: 2019-01-22
DALIAN UNIV OF TECH


Problems solved by technology

The advantage of this method is its high accuracy, but it applies to only a single scenario; changing to another scenario requires a large amount of time and data to retrain the model (Zeng A, Yu K T, Song S, et al. Multi-view Self-supervised Deep Learning for 6D Pose Estimation in the Amazon Picking Challenge [J]. 2016: 1386-1383.)
Both of the above methods are therefore insufficient for real-time pose detection of hand-held objects in such scenes.

Method used




Embodiment Construction

[0067] The technical solutions of the present invention will be further described below in conjunction with specific embodiments and accompanying drawings.

[0068] In a specific implementation of the present invention, a Kinect sensor is used to capture depth images of the object to be detected from multiple orientations, from which a complete three-dimensional point cloud model of the object is obtained. The effective viewing angle of the Kinect sensor is 57° in the horizontal direction and 43° in the vertical direction, and its visual range is 0.4 m to 3.5 m. The Kinect sensor generates depth images and color images at a rate of 30 frames per second, both with a resolution of 640×480 pixels.
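As a rough sanity check of these sensor specifications, the field-of-view angles above determine how large an area the Kinect covers at a given working distance. The following sketch (not part of the patent; the function name and defaults are illustrative) computes that coverage from the quoted 57°/43° values:

```python
import math

def kinect_coverage(distance_m, h_fov_deg=57.0, v_fov_deg=43.0):
    """Width and height (in meters) of the area visible to the sensor
    at a given distance, from the pinhole/FOV relation w = 2*d*tan(fov/2).

    FOV defaults are the Kinect values quoted in the text; the function
    itself is an illustrative sketch."""
    w = 2.0 * distance_m * math.tan(math.radians(h_fov_deg) / 2.0)
    h = 2.0 * distance_m * math.tan(math.radians(v_fov_deg) / 2.0)
    return w, h

# At a 1 m working distance the sensor sees roughly a 1.09 m x 0.79 m window,
# comfortably larger than a typical hand-held object.
w, h = kinect_coverage(1.0)
```

This also shows why the 0.4 m near limit matters: closer than that, the visible window shrinks below the size of many hand-held objects and depth readings become unreliable.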

[0069] A method for real-time detection of the pose of a hand-held object oriented to human-machine collaboration, comprising the following steps:

[0070] The first step is to collect data and preprocess the point cloud to establish a complete 3D point cloud mod...
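The first step turns each depth image into a local point cloud before the clouds are aligned and merged. A minimal sketch of that back-projection, assuming standard pinhole intrinsics (fx, fy, cx, cy are placeholder values, not taken from the patent):

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters, zeros = invalid) into an
    N x 3 point cloud in the camera frame.

    Intrinsics fx, fy, cx, cy are assumed pinhole parameters; a real
    pipeline would use the calibrated values of the sensor."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx   # pinhole model: X = (u - cx) * Z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # drop invalid (zero-depth) pixels
```

For a 640×480 Kinect frame this yields up to 307 200 points per view; merging several views (one per orientation of the object) produces the complete model the later matching step relies on.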



Abstract

The invention provides a man-machine cooperation oriented real-time posture detection method for hand-held objects, belonging to the technical fields of human-machine cooperative interaction systems and an industrial robot's sensing of the position and posture of a hand-held work object. Depth images of each part of the object to be detected are captured by a 3D stereoscopic camera, and the local point clouds are aligned and merged into a complete three-dimensional point cloud model of the object. Real-time RGB color images and depth images containing 3D point cloud information of the scene are then obtained. The RGB image is segmented automatically to obtain the pixels representing the object, and the corresponding point cloud in the depth image is fused with these pixels to produce an RGB-D image of the object in the scene with color information. Using the ICP algorithm, the RGB-D image is matched against the complete 3D point cloud model of the object to obtain the position and posture of the hand-held object in the scene. The method overcomes the problem of obtaining the exact posture of a hand-held object at the current time and can be used in a variety of scenes.
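The final matching step in the abstract uses ICP to align the observed RGB-D point cloud with the complete object model. A minimal point-to-point ICP sketch, with brute-force nearest neighbours and a Kabsch (SVD) rigid-transform solve (illustrative only; a production system would use a KD-tree, outlier rejection, and the color information mentioned above):

```python
import numpy as np

def icp(src, dst, iters=20):
    """Minimal point-to-point ICP sketch: returns R (3x3) and t (3,)
    such that src @ R.T + t approximates dst.

    Brute-force correspondences; suitable only for small clouds."""
    R_total, t_total = np.eye(3), np.zeros(3)
    cur = src.copy()
    for _ in range(iters):
        # 1) nearest-neighbour correspondences (brute force)
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matched = dst[d2.argmin(axis=1)]
        # 2) Kabsch: best rigid transform for these correspondences
        mu_s, mu_d = cur.mean(0), matched.mean(0)
        H = (cur - mu_s).T @ (matched - mu_d)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T
        t = mu_d - R @ mu_s
        # 3) apply and accumulate
        cur = cur @ R.T + t
        R_total = R @ R_total
        t_total = R @ t_total + t
    return R_total, t_total
```

The recovered (R, t) is exactly the pose the method seeks: it places the stored object model at the position and orientation of the hand-held object in the current scene.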

Description

Technical Field

[0001] The invention belongs to the technical field of human-computer cooperative interaction systems and an industrial robot's perception of the position and posture of working objects, and relates to a real-time detection method for the position and posture of a hand-held object oriented to human-computer cooperation.

Background Technique

[0002] Robotics is a relatively young field of modern technology that has grown across traditional engineering boundaries, covering fields such as electrical engineering, mechanical engineering, systems and industrial engineering, computer science, economics, and mathematics. In the 1960s, with the birth of the first programmable robot, the field of robotics developed rapidly. From the perspective of its development history, it has generally passed through three stages. In the early stage, the first-generation robot was called a teaching robot; it was mainly taught by the operator, and the internal progr...

Claims


Application Information

IPC(8): G06T7/70; G06T7/80; G06T7/90; G06T7/136
CPC: G06T2207/10028; G06T7/136; G06T7/70; G06T7/80; G06T7/90
Inventor: 闫飞, 高雅楠, 庄严, 李卓函
Owner DALIAN UNIV OF TECH