Face detecting and tracking method and device and method and system for controlling rotation of robot head

Active Publication Date: 2018-02-01
UBTECH ROBOTICS CORP LTD
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Benefits of technology

The face detecting and tracking method described in this patent improves the accuracy of detecting and tracking faces in images. It uses depth detection to determine the depth value of each pixel in an image, which helps to identify areas that may contain faces. The method then creates a tracking box around these areas to ensure continuous detection of the face. The method can be used with a built-in camera device and can control a robot head to rotate to position the tracking box in the center of the image. Overall, this method improves the accuracy and efficiency of face detection and tracking in image processing.

Problems solved by technology

Some conventional human face detecting and tracking methods have high detection error rates and tend to recognize non-human faces as human faces.
Because of these detection errors, the rotation of a robot head cannot be controlled accurately when such methods are used for face tracking.

Method used



Examples


Embodiment 1

[0016]FIGS. 1 and 2 show flow charts of a human face detecting and tracking method according to one embodiment. As shown in FIG. 1, the method includes the following steps:

[0017]Step S11: acquiring an image and performing a depth detection for the image to obtain a depth value of each pixel of the image. In the embodiment, each image includes I×J pixels, i.e., I rows and J columns of pixels. A pixel (i, j) refers to the pixel in row i, column j of the image, and the depth value of the pixel (i, j) is represented as d(i, j). One or more depth sensors are used to perform depth detection and processing on the captured images to acquire the depth value d(i, j) of each pixel (i, j).
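Step S11 can be sketched as building a 2-D array of per-pixel depth values d(i, j). This is an illustrative sketch only: the sensor interface and the millimeter units are assumptions, since real depth cameras (structured-light or time-of-flight) each expose their own SDKs.

```python
import numpy as np

I, J = 4, 6  # small I x J image for illustration

def acquire_depth_map(raw_frame):
    """Return the depth value d(i, j) for each pixel (i, j) of a frame.

    Hypothetical helper: here the raw frame is assumed to already hold
    per-pixel depth in millimeters; a real pipeline would convert
    sensor-specific units instead.
    """
    return np.asarray(raw_frame, dtype=float)

raw = np.full((I, J), 1500.0)   # every pixel 1.5 m from the camera
d = acquire_depth_map(raw)
print(d.shape)   # (4, 6): I rows, J columns
print(d[0, 0])   # depth of the pixel in row 0, column 0
```

Indexing `d[i, j]` then matches the patent's d(i, j) notation, with i selecting the row and j the column.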

[0018]Step S12: determining one or more face candidate areas based on the depth value d(i, j) of each pixel (i, j) of the image of the current frame. A face candidate area is an area that may include a human face. A human face detecting method is used to detect a human face in the face candidate area, which increases the face detection accuracy.
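One plausible reading of Step S12 can be sketched as follows. This is not the patent's exact algorithm: the depth band for a "plausible person distance" and the single-bounding-box simplification are assumptions for illustration.

```python
import numpy as np

def candidate_areas(d, near=500.0, far=2000.0):
    """Return bounding boxes (top, left, bottom, right) of areas whose
    depth lies in [near, far] millimeters (assumed face-distance band)."""
    mask = (d >= near) & (d <= far)
    if not mask.any():
        return []
    rows = np.flatnonzero(mask.any(axis=1))  # rows containing candidates
    cols = np.flatnonzero(mask.any(axis=0))  # columns containing candidates
    # One box covering all candidate pixels; a full implementation would
    # label connected components and return one box per component.
    return [(int(rows[0]), int(cols[0]), int(rows[-1]), int(cols[-1]))]

d = np.full((6, 8), 3000.0)   # background 3 m away
d[1:4, 2:5] = 1200.0          # a blob at ~1.2 m: possible face area
print(candidate_areas(d))     # [(1, 2, 3, 4)]
```

Running the face detector only inside these boxes, rather than over the whole frame, is what the passage credits with reducing false detections.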

Embodiment 2

[0037]FIG. 4 shows a flow chart of a method for controlling rotation of a robot head. The robot is equipped with a camera device for capturing images, and a depth sensor and a processor for detecting depth of the captured images. The method for controlling rotation of a robot head is performed by the processor and includes the following steps:

[0038]Step S21: determining a tracking box of a first image using the method described in embodiment 1, and a center of the tracking box. As shown in FIG. 5, the coordinates (x0, y0) of the center B of the tracking box of the first image are read in real-time. It can be understood that, when tracking a face using the face detecting and tracking method of embodiment 1, first determining one or more face candidate areas according to the depth value of each pixel and then performing a face detection in those candidate areas can reduce detection error and increase face detection accuracy. The face detecting and trackin...
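The control loop that follows Step S21 can be sketched as a proportional correction that rotates the head until the tracking-box center (x0, y0) coincides with the image center. The gain value and the pan/tilt command interface are assumptions, not taken from the patent.

```python
def head_rotation_command(x0, y0, width, height, gain=0.1):
    """Return (pan, tilt) angle increments (degrees, hypothetical units)
    that reduce the offset between the tracking-box center (x0, y0)
    and the image center."""
    cx, cy = width / 2.0, height / 2.0
    pan = gain * (x0 - cx)    # positive: face right of center, pan right
    tilt = gain * (y0 - cy)   # positive: face below center, tilt down
    return pan, tilt

# Tracking-box center at (400, 300) in a 640x480 image: the face sits
# right of and below the image center, so both increments are positive.
print(head_rotation_command(400, 300, 640, 480))  # (8.0, 6.0)
```

When the box is already centered the increments are zero, so the head stops rotating, which matches the stated goal of positioning the tracking box in the center of the image.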

Embodiment 3

[0047]FIG. 6 shows a block diagram of a face detecting and tracking device 1. The face detecting and tracking device 1 includes a depth detecting and processing module 10 that is used to acquire an image and perform a depth detection for the image to obtain a depth value of each pixel of the image. In the embodiment, each image includes I×J pixels, i.e., I rows and J columns of pixels. A pixel (i, j) refers to the pixel in row i, column j of the image, and the depth value of the pixel (i, j) is represented as d(i, j). One or more depth sensors are used to perform depth detection and processing on the captured images to acquire the depth value d(i, j) of each pixel (i, j).

[0048]The face detecting and tracking device 1 also includes a face candidate area determining module 20 that is used to determine one or more face candidate areas based on depth value d (i, j) of each pixel (i, j) of the image of current frame. A face candidate area is an area that may include a human face. A human ...



Abstract

A face detecting and tracking method includes: acquiring an image and performing a depth detection for the image to obtain a depth value of each pixel of the image; determining one or more face candidate areas based on the depth value of each pixel of the image of the current frame; performing a face detection on the one or more face candidate areas to determine one or more face boxes of the image of the current frame; and determining a tracking box of the image of the current frame based on the one or more face boxes and a tracked face box, and tracking the face in the tracking box of the image of the current frame.

Description

CROSS REFERENCE TO RELATED APPLICATIONS

[0001]This application claims priority to Chinese Patent Application No. 201610619340.1, filed Jul. 29, 2016, which is hereby incorporated by reference herein as if set forth in its entirety.

BACKGROUND

1. Technical Field

[0002]The present disclosure generally relates to human face recognition technology, and particularly to a human face detecting and tracking method and device, and a method and system for controlling rotation of a robot head.

2. Description of Related Art

[0003]As computer vision technology becomes mature, it has been widely used in daily lives. For example, it can be used to control a robot and enables the robot to have visual function, thereby performing various detecting, determining, recognizing and measuring operations. Some conventional robots are equipped with a camera and are able to detect human faces in the images captured by the camera using face detecting technology, and track a detected human face, causing head of the...

Claims


Application Information

IPC(8): G06K9/00; B25J9/00; G06T7/00; G06V10/764
CPC: G06K9/00228; B25J9/0003; G06T7/0051; G06K9/00268; G06V40/166; G06V40/172; G06T2200/04; G06T2207/10016; G06T2207/10028; G06T2207/30201; G06T7/248; G06V20/64; G06V40/161; G06V40/168; G06V10/446; G06V10/764; G06F18/24155; G06T7/50; G06T7/246; B25J9/1684; B25J19/02
Inventors: JI, YUAN; XIONG, YOUJUN
Owner: UBTECH ROBOTICS CORP LTD