Target object recognition and positioning method based on color images and depth images

A technology relating to target objects and color images, applied in scene recognition, character and pattern recognition, instruments, etc. It addresses problems such as the inability to distinguish the target object from surrounding objects and the false detection of feature points, achieving high real-time performance and improved efficiency.

Active Publication Date: 2017-06-13
JIANGSU CAS JUNSHINE TECH

Problems solved by technology

However, when the SURF algorithm faces the occlusion of other objects around the target object, it cannot distinguish the target object well, resulting in false detection of feature points.
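The invention mitigates this by gating candidate pixels with the depth image, so that occluding objects at a different depth than the target are discarded before feature matching. A minimal numpy sketch of such depth gating (the function name and tolerance are illustrative assumptions, not the patent's exact procedure):

```python
import numpy as np

def depth_gate(candidate_mask, depth_map, target_depth, tol=0.15):
    """Keep only candidate pixels whose depth lies within tol metres of the
    expected target depth; occluders at other depths drop out of the mask."""
    near_target = np.abs(depth_map - target_depth) <= tol
    return candidate_mask & near_target

# Toy example: a 1x4 row where an obstacle (0.5 m) partially occludes
# the target plane (1.0 m), with background at 2.0 m.
depth = np.array([[0.5, 1.0, 1.0, 2.0]])
candidates = np.array([[True, True, True, True]])
kept = depth_gate(candidates, depth, target_depth=1.0)
print(kept.tolist())  # -> [[False, True, True, False]]
```

Only pixels near the target's depth survive, so feature points are no longer falsely detected on the occluding object.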




Embodiment Construction

[0030] The present invention is further described below with reference to the accompanying drawings.

[0031] The method for identifying and locating a target object based on a color image and a depth image of the present invention comprises the following steps:

[0032] (1) The mobile robot identifies the target object using long-range HSV color recognition and short-range SURF feature point detection, and removes obstacles using the depth image;

[0033] Specifically, as shown in Figure 1: to overcome the limited working space caused by a fixed camera position, the camera and the robotic arm are mounted on the mobile platform and move together with the robot. On receiving an instruction to grasp an object, the robot searches the surrounding environment for the target object through the camera. When the target object is far away, its feature points are not prominent enough, and the recognition accuracy...
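As a sketch of the long-range color step, the target region can be found by thresholding in HSV space and taking the centroid of the matching pixels. The helper names and HSV ranges below are illustrative assumptions (an OpenCV pipeline would typically use cv2.cvtColor and cv2.inRange); plain numpy is used here on an image assumed to already be in HSV:

```python
import numpy as np

def hsv_color_mask(hsv_image, lower, upper):
    """Boolean mask of pixels whose H, S, and V all fall within [lower, upper]."""
    lower = np.asarray(lower)
    upper = np.asarray(upper)
    return np.all((hsv_image >= lower) & (hsv_image <= upper), axis=-1)

def region_centroid(mask):
    """Centroid (row, col) of the masked region, or None if the mask is empty."""
    ys, xs = np.nonzero(mask)
    if len(ys) == 0:
        return None
    return float(ys.mean()), float(xs.mean())

# Toy 4x4 HSV image with a red-ish block (low hue) in the top-left corner.
img = np.zeros((4, 4, 3), dtype=np.uint8)
img[:2, :2] = (5, 200, 200)                       # target colour
mask = hsv_color_mask(img, (0, 100, 100), (10, 255, 255))
print(region_centroid(mask))                      # -> (0.5, 0.5)
```

The centroid gives a coarse image position of the target region, which is enough to drive navigation toward it before the short-range SURF stage takes over.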



Abstract

The invention relates to a target object recognition and positioning method based on color images and depth images, comprising the following steps: (1) the robot confirms the target region using long-range HSV color recognition, obtains the distance to the target region from the RGB color image and the depth image, and performs navigation and path planning to move near the target region; (2) on reaching the vicinity of the target region, the robot obtains the RGB feature information of the target object through SURF feature point detection and matches it against pre-stored RGB feature information of the target object; if the features match an existing object model, the target object is located; (3) the two-dimensional coordinates of the target object in the imaging plane are obtained from the RGB color image, and the relative distance between the target object and the camera is obtained from the depth image, yielding the three-dimensional coordinates of the target object. With this method, both the category of an object and its three-dimensional coordinates can be determined quickly.
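Step (3) above is the standard pinhole back-projection: given the pixel coordinates (u, v) from the color image and the depth Z from the depth image, the camera-frame 3D point follows from the camera intrinsics. A short sketch, where the intrinsic values are hypothetical Kinect-like numbers and not taken from the patent:

```python
import numpy as np

def pixel_to_camera_xyz(u, v, z, fx, fy, cx, cy):
    """Back-project pixel (u, v) with depth z (metres) into camera coordinates
    using the pinhole model: X = (u - cx) * z / fx, Y = (v - cy) * z / fy."""
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])

# Hypothetical intrinsics for a 640x480 depth camera.
fx = fy = 525.0
cx, cy = 319.5, 239.5
p = pixel_to_camera_xyz(339.5, 239.5, 1.0, fx, fy, cx, cy)
print(p)  # 20 px right of centre at 1 m depth -> X of about 0.038 m
```

In practice the depth camera and color camera must be registered so that (u, v) and z refer to the same physical point; the intrinsics come from camera calibration.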

Description

technical field

[0001] The invention relates to a method for identifying and locating a target object, in particular to a method based on a color image and a depth image.

Background technique

[0002] As requirements for robot functionality grow, it has become a development trend for mobile robots to have visual capability and to work "hand-eye" together with a robotic arm mounted on the mobile platform. However, traditional visual recognition and positioning methods use monocular or binocular vision, which is easily affected by illumination changes and requires a large amount of calculation. Monocular vision in particular must obtain three-dimensional spatial coordinates by comparing images of the same object taken from different angles, which is computationally complicated. Therefore, using infrared cameras and receivers to obtain object d...

Claims


Application Information

IPC(8): B25J9/16 G06K9/00 G06K9/46
CPC: B25J9/1664 B25J9/1697 G06V20/10 G06V20/58 G06V10/56 G06V10/462
Inventors: 宋少博, 赵旦谱, 台宪青
Owner JIANGSU CAS JUNSHINE TECH