
Method used for intelligent robot system to achieve real-time face tracking

A real-time face tracking technology for intelligent robots, applied in the field of intelligent robots. It addresses the problems of complex calculation and difficult continuous face tracking, with the effects of simplifying the tracking algorithm, improving the human-computer interaction experience, and reducing system cost.

Inactive Publication Date: 2016-07-13
BEIJING GUANGNIAN WUXIAN SCI & TECH

AI Technical Summary

Problems solved by technology

However, such modeling methods and tracking algorithms generally require complex calculations that consume significant system resources, making real-time, continuous face tracking difficult to achieve.

Method used

Examples


Embodiment 1

[0026] Figure 1 is a schematic flow diagram of a method for an intelligent robot system to track a human face in real time according to an embodiment of the present invention, and Figure 2 is a schematic structural diagram of an intelligent robot system capable of real-time face tracking according to an embodiment of the present invention. As can be seen from Figure 2, the intelligent robot system 20 mainly includes a camera 21, an Android board 22, a main control board 23 and an execution device 24.

[0027] Further, as shown in Figure 2, the Android board 22 is mainly provided with a first processor 221, together with a data transmission circuit 222 and a storage unit 223 connected to the first processor, while the main control board 23 is mainly provided with a second processor 231, together with a data receiving circuit 232, a motor drive control circuit 233 and a steering gear output interface circuit 234 connected to the second processor 231. The camera 21 is co...
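
To make the hardware layout of Embodiment 1 easier to follow, here is a minimal Python sketch of how the components might be modelled in software. The class and method names (AndroidBoard, MainControlBoard, drive, etc.) are illustrative assumptions; the patent describes hardware circuits, not a software API.

```python
# Illustrative model of the system in Figure 2: camera 21, Android board 22
# (first processor 221, data transmission circuit 222, storage unit 223) and
# main control board 23 (second processor 231, data receiving circuit 232,
# motor drive control circuit 233, steering gear output interface circuit 234).
# All names here are hypothetical; the patent defines hardware, not code.
from dataclasses import dataclass, field


@dataclass
class AndroidBoard:
    """Stands in for board 22: holds image data in place of storage unit 223."""
    storage: dict = field(default_factory=dict)

    def store_image(self, key: str, image_data: bytes) -> None:
        self.storage[key] = image_data

    def read_image(self, key: str) -> bytes:
        return self.storage[key]


@dataclass
class MainControlBoard:
    """Stands in for board 23: receives movement commands and drives actuators."""

    def drive(self, pan_step: int, tilt_step: int) -> None:
        # In the real system this would pass through the motor drive control
        # circuit 233 and the steering gear output interface circuit 234.
        print(f"drive command: pan={pan_step}, tilt={tilt_step}")


@dataclass
class RobotSystem:
    """Camera 21 + Android board 22 + main control board 23 + actuators 24."""
    android_board: AndroidBoard = field(default_factory=AndroidBoard)
    main_board: MainControlBoard = field(default_factory=MainControlBoard)
```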

Embodiment 2

[0042] Figure 4 is a schematic flowchart of a method for an intelligent robot system to track a human face in real time according to another embodiment of the present invention. Referring to Figures 2 and 4, the camera 21 is first called according to the received multi-modal input instruction to obtain a detection image containing a human face, and the preprocessed image data is stored in the storage unit 223 arranged on the Android board 22. This step is the same as the corresponding step in Embodiment 1 and is not repeated here. The first processor 221 then reads the image data from the storage unit 223 and recognizes the face information in the detection image.
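
As a rough illustration of the front end of this flow (capture, preprocess, store on the Android board, read back, detect), the following Python sketch strings the steps together. The capture_frame, preprocess and detect_faces helpers are placeholders; the patent does not name a concrete camera API or face-detection library.

```python
# Hypothetical sketch of the Embodiment 2 front end: capture a detection image,
# store the preprocessed data in the Android board's storage unit, read it back
# and run face detection. Helper functions are stand-ins, not the patented code.
from typing import List, Tuple

Box = Tuple[int, int, int, int]           # (x, y, width, height), assumed layout


def capture_frame() -> bytes:
    """Stand-in for calling camera 21 on a multi-modal input instruction."""
    return b"raw-image-bytes"


def preprocess(raw: bytes) -> bytes:
    """Stand-in for whatever preprocessing precedes storage (not specified)."""
    return raw


def detect_faces(image_data: bytes) -> List[Box]:
    """Stand-in for the face recognition run by the first processor 221."""
    return [(120, 80, 64, 64)]            # dummy single face


storage_unit: dict = {}                   # plays the role of storage unit 223

storage_unit["latest"] = preprocess(capture_frame())
faces = detect_faces(storage_unit["latest"])
print(f"{len(faces)} face(s) found:", faces)
```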

[0043] In practice, the detection image collected by the camera 21 may contain more than one face, so in this embodiment the number of faces contained in the detection image is judged first; if only one face is included in the detection image f...
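
The paragraph above is truncated, so only the face-count judgment itself is certain. The sketch below implements that judgment and adds an assumed tie-break rule (keep the largest bounding box) for the multi-face case; that rule is a guess for illustration, not something the patent states.

```python
# Hedged sketch of the face-count judgment in paragraph [0043]. The rule used
# when several faces are present (largest bounding box wins) is an assumption.
from typing import List, Optional, Tuple

Box = Tuple[int, int, int, int]           # (x, y, width, height)


def select_face(faces: List[Box]) -> Optional[Box]:
    if not faces:
        return None                       # no face: nothing to track
    if len(faces) == 1:
        return faces[0]                   # single face: track it directly
    # Multiple faces: pick the one with the largest area (assumed heuristic).
    return max(faces, key=lambda b: b[2] * b[3])


print(select_face([(10, 10, 40, 40), (100, 50, 80, 80)]))   # -> (100, 50, 80, 80)
```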


Abstract

The invention discloses a method used for an intelligent robot system to achieve real-time face tracking, comprising the steps of: receiving a multi-modal input command, and calling a camera according to the multi-modal input command to obtain a detection image containing a face; utilizing a processor on an Android board to obtain position information of the face in the detection image, and performing a judgment based on the position information and a preset face position; when the position of the face in the detection image is not consistent with the preset face position, utilizing a processor on a main control board to control the robot to move, while simultaneously outputting multi-modal output corresponding to the multi-modal input command; and re-acquiring a detection image containing a face, obtaining position information of the face in the detection image, and performing the judgment based on the position information and the preset face position until the position of the face in the detection image is consistent with the preset face position. The method simplifies the tracking algorithm, reduces system cost, and achieves real-time, continuous face tracking.
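
To make the abstract's control loop concrete, here is a hedged Python sketch of the compare-and-correct cycle: detect the face position, compare it with the preset position, and command a movement when they differ, repeating until they coincide. The preset coordinates, tolerance, and the get_face_position / move_robot helpers are illustrative assumptions, not details taken from the patent.

```python
# Rough sketch of the loop described in the abstract: judge the detected face
# position against a preset position and keep correcting until they match.
from typing import Optional, Tuple

PRESET = (320, 240)          # assumed target: image centre of a 640x480 frame
TOLERANCE = 20               # assumed pixel tolerance for "consistent"


def get_face_position() -> Optional[Tuple[int, int]]:
    """Stand-in for: capture image, detect face, return its centre (or None)."""
    return (400, 200)


def move_robot(dx: int, dy: int) -> None:
    """Stand-in for the main control board driving the motors / steering gear."""
    print(f"move: dx={dx}, dy={dy}")


def track_once() -> bool:
    """One pass of the loop; returns True when the face matches the preset."""
    pos = get_face_position()
    if pos is None:
        return False                         # no face detected this frame
    dx, dy = PRESET[0] - pos[0], PRESET[1] - pos[1]
    if abs(dx) <= TOLERANCE and abs(dy) <= TOLERANCE:
        return True                          # face already at preset position
    move_robot(dx, dy)                       # otherwise command a correction
    return False


for _ in range(3):                           # a few demonstration passes
    if track_once():
        break
```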

Description

Technical Field

[0001] The invention relates to the field of intelligent robots, and in particular to a method for an intelligent robot system to track human faces in real time.

Background Technique

[0002] With the development of robot technology, intelligent robot products have increasingly penetrated into all aspects of people's lives. Robots are not only used to help users complete specified tasks efficiently, but are also designed as partners that can interact with users through language, movement and emotion.

[0003] A commonly used mode of human-to-human interaction is face-to-face communication, because it makes it easier to understand the intention of the other party and to respond to their emotional expression. Similarly, in the field of intelligent robots, the human face, as an important visual image, can convey the user's age, gender, identity, and most of their emotional information. Therefore, in the process of human-compu...


Application Information

Patent Type & Authority: Applications (China)
IPC(8): G05B19/04; G06K9/00
CPC: G05B19/04; G06V40/161
Inventor: 贾梓筠, 韩冬
Owner: BEIJING GUANGNIAN WUXIAN SCI & TECH