3D (three-dimensional) sight direction estimation method for robot interaction object detection

A technology relating to interaction objects and line-of-sight direction, applied in the fields of 3D object measurement, acquisition, instrumentation, and calculation, capable of solving problems such as a lack of robustness

Active Publication Date: 2015-09-30
UNIV OF ELECTRONICS SCI & TECH OF CHINA

AI Technical Summary

Problems solved by technology

[0004] Therefore, in recent years some researchers have begun to apply gaze direction estimation technology to social service robots, so that the robot automatically locks onto the user and the user can control the robot with the eyes. Research on gaze direction estimation in recent years falls mainly into two categories: one estimates the direction with a neural network; the other relies on analysis of eye features. Estimating the line-of-sight direction with a neural network offers high precision and real-time performance, but lacks robustness, because the model must be rebuilt whenever the environment changes.

Method used




Embodiment Construction

[0068] The technical solution of the present invention will be further described in detail below in conjunction with the accompanying drawings, but the protection scope of the present invention is not limited to the following description.

[0069] As shown in Figure 1, a 3D gaze direction estimation method for robot interactive object detection includes the following steps:

[0070] S1. Perform head pose estimation: use an RGBD sensor to collect color information and depth information, and calculate the three-dimensional head position T and head pose R from the collected information;
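
The patent excerpt does not disclose its pose-estimation code. Purely as an illustrative sketch, one common way to obtain T and R from RGBD data is to back-project detected facial landmarks into 3D using the depth map and camera intrinsics, then rigidly align them to a canonical landmark model; the intrinsics FX/FY/CX/CY and the canonical model `model_3d` below are hypothetical stand-ins, not values from the patent.

```python
# Illustrative sketch of S1 (not the patent's exact algorithm): back-project 2D
# facial landmarks with the depth map, then fit head pose R and position T by
# rigidly aligning a canonical 3D landmark model (Kabsch alignment).
import numpy as np

FX, FY, CX, CY = 525.0, 525.0, 319.5, 239.5   # assumed RGBD intrinsics (Kinect-like)

def backproject(u, v, z):
    """Map pixel (u, v) with depth z (metres) to a 3D point in the camera frame."""
    return np.array([(u - CX) * z / FX, (v - CY) * z / FY, z])

def estimate_head_pose(landmarks_2d, depth, model_3d):
    """Return head rotation R (3x3) and position T (3,) from landmark correspondences."""
    pts = np.array([backproject(u, v, depth[int(v), int(u)]) for u, v in landmarks_2d])
    p = model_3d - model_3d.mean(axis=0)       # centred canonical model
    q = pts - pts.mean(axis=0)                 # centred observed landmarks
    U, _, Vt = np.linalg.svd(p.T @ q)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    T = pts.mean(axis=0) - R @ model_3d.mean(axis=0)
    return R, T
```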

[0071] S2. Calculate the mapping matrix M between the head pose R and the head reference pose R0, where the head reference pose R0 is the head pose when the user and the robot face each other, R0 = [0, 0, 1];
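
The excerpt gives R0 = [0, 0, 1] as a direction but does not spell out how M is formed. One plausible, purely illustrative reading is that M is the rotation carrying the current head direction back onto the frontal reference direction, so later steps can reason as if the user faced the robot; the sketch below assumes that reading and uses the standard Rodrigues construction.

```python
# Hedged sketch of S2: build a rotation M that maps the current head direction
# onto the frontal reference direction R0 = [0, 0, 1] (Rodrigues formula).
import numpy as np

R0 = np.array([0.0, 0.0, 1.0])                 # reference: user facing the robot

def mapping_matrix(head_dir):
    """Rotation M such that M @ head_dir is parallel to R0."""
    a = head_dir / np.linalg.norm(head_dir)
    v, c = np.cross(a, R0), float(np.dot(a, R0))
    if np.isclose(c, 1.0):                     # head already frontal
        return np.eye(3)
    if np.isclose(c, -1.0):                    # degenerate 180-degree case
        return np.diag([1.0, -1.0, -1.0])
    K = np.array([[0.0, -v[2], v[1]],
                  [v[2], 0.0, -v[0]],
                  [-v[1], v[0], 0.0]])
    return np.eye(3) + K + (K @ K) / (1.0 + c)
```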

[0072] S3. Collect an image containing the human eyes, and extract the eye-region image from the collected picture;
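
The Abstract states that a trained strong classifier performs the eye detection. As a stand-in for that classifier, the sketch below uses OpenCV's bundled Haar cascade (an AdaBoost-style detector); the cascade file is OpenCV's, not one trained by the patent's authors.

```python
# Sketch of S3: detect eye regions with a pre-trained cascade classifier and
# return the cropped eye-region images.
import cv2

eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def extract_eye_regions(bgr_frame):
    """Return grayscale crops of the eye regions found in a colour frame."""
    gray = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2GRAY)
    eyes = eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return [gray[y:y + h, x:x + w] for (x, y, w, h) in eyes]
```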

[0073] S4. After obtaining the ima...
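
The description is truncated here, but the Abstract says S4 locates the pupil center with a projection integral method, a Hough transform and perspective correction. The sketch below illustrates only the first two ideas under assumed parameters; the patent's exact procedure and thresholds are not given in this excerpt.

```python
# Rough sketch of a projection-integral + Hough-circle pupil locator; all
# parameter values are assumptions, not the patent's settings.
import cv2
import numpy as np

def pupil_center(eye_gray):
    """Return an (x, y) pupil-centre estimate for a grayscale eye-region image."""
    inv = 255 - eye_gray                                   # pupil is dark -> make it bright
    cx = int(np.argmax(inv.sum(axis=0)))                   # vertical projection integral
    cy = int(np.argmax(inv.sum(axis=1)))                   # horizontal projection integral
    blurred = cv2.GaussianBlur(eye_gray, (5, 5), 0)
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1.5,
                               minDist=eye_gray.shape[0],
                               param1=100, param2=15, minRadius=3, maxRadius=20)
    if circles is not None:                                # refine with the strongest circle
        cx, cy = int(circles[0, 0, 0]), int(circles[0, 0, 1])
    return cx, cy
```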



Abstract

The invention discloses a 3D (three-dimensional) sight direction estimation method for robot interaction object detection. The method comprises the following steps: S1, head posture estimation; S2, mapping matrix calculation; S3, human eye detection; S4, pupil center detection; S5, sight direction calculation; S6, interaction object judgment. In this method, an RGBD (red, green, blue and depth) sensor is used for head posture estimation and applied to a robot; the system adopts only the RGBD sensor, requires no other sensors, and has the characteristics of simple hardware and ease of use. A trained strong classifier is used for human eye detection, which is simple to use and gives good detection and tracking results. A projection integral method, a Hough transform and perspective correction are adopted when detecting the pupil center, so the obtained pupil center is more accurate.
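
The abstract lists S5 (sight direction calculation) and S6 (interaction object judgment) without further detail in this excerpt. As a speculative illustration only, a gaze vector could be composed from the head rotation R and the pupil's offset within the eye region, and the interaction-object test could check whether that gaze ray points at the robot; the gain constant and threshold below are invented for the example, not taken from the patent.

```python
# Speculative sketch of S5/S6: combine head rotation with a pupil offset to get a
# 3D gaze direction, then test whether the gaze falls on the robot within a cone.
import numpy as np

GAIN = 0.003          # hypothetical radians-per-pixel eye calibration factor

def gaze_direction(R, pupil_xy, eye_center_xy):
    """Unit 3D gaze vector in the camera frame."""
    dx, dy = np.subtract(pupil_xy, eye_center_xy).astype(float)
    eye_dir = np.array([np.tan(GAIN * dx), np.tan(GAIN * dy), 1.0])  # eye-in-head ray
    g = R @ (eye_dir / np.linalg.norm(eye_dir))                      # rotate into camera frame
    return g / np.linalg.norm(g)

def is_interaction_object(gaze, head_pos, robot_pos=np.zeros(3), thresh_deg=10.0):
    """Return True if the gaze ray points at the robot within thresh_deg degrees."""
    to_robot = robot_pos - head_pos
    cos_ang = gaze @ to_robot / np.linalg.norm(to_robot)
    return float(np.degrees(np.arccos(np.clip(cos_ang, -1.0, 1.0)))) < thresh_deg
```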

Description

Technical Field

[0001] The invention relates to a method for estimating a 3D line-of-sight direction for robot interactive object detection.

Background Technique

[0002] Human-Computer Interaction (HCI) is the technical science that studies communication between humans and computers through mutual understanding, with the aim of completing information management, service, and processing functions for people to the greatest extent and making the computer a truly harmonious tool for people's work and study.

[0003] As an important branch of human-computer interaction technology, line-of-sight estimation technology mainly studies the detection and recognition of human eye movement characteristics and realizes automatic control of other functional systems. The biggest advantage of this technology is that it allows external devices to be controlled through eye gaze, thereby enabling multi-task operation; relevant statistics sho...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62; G06K9/20; G06K9/00
CPC: G06V40/197; G06V10/235; G06V2201/06; G06V2201/12; G06F18/40; G06F18/2155
Inventors: 程洪, 姬艳丽, 谢道训, 杨路, 谢非
Owner: UNIV OF ELECTRONICS SCI & TECH OF CHINA