
Artificial head robot with facial expression and multiple perceptional functions

A humanoid head robot technology with facial expression and multi-sensory functions, applied in the field of humanoid avatar robots. It solves the problems that existing robots have a single expression and lack multi-sensing functions, and achieves high cost-effectiveness, small size, and large torque.

Active Publication Date: 2009-06-17
HARBIN INST OF TECH

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to provide a humanoid avatar robot with facial expressions and multi-sensing functions, in order to solve the problem that existing robots have only a single expression and lack multi-sensing functions.



Examples


Specific Embodiment 1

[0011] This embodiment is described with reference to Figure 1 and Figure 4. It comprises a robot body 1, a motion control system 2, a master computer 3, and a multi-sensory sensing system 4.

[0012] The robot body 1 includes a motion drive mechanism 5, an expression drive mechanism 6, a facial shell 7, and a facial elastic skin 8.

[0013] The motion control system 2 includes a steering gear controller 9 and a plurality of steering gears 10.

[0014] The information sensed by the multi-sensory sensing system 4 is output to the master computer 3 for processing. The master computer 3 outputs corresponding command information to the steering gear controller 9, which outputs PWM pulses to drive the corresponding steering gear 10 to rotate to the designated position. The steering gear 10 drives the motion drive mechanism 5 to move the lip skin, thereby realizing the robot's mouth-shape imitation function...
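The sense → process → command → PWM chain described in [0014] can be sketched as follows. All class and function names, the servo IDs, the pulse-width range (1000–2000 µs), and the decision rule are illustrative assumptions; the patent describes the signal flow but specifies no API.

```python
# Illustrative sketch of the control chain: sensors -> master computer ->
# steering gear (servo) controller -> PWM -> drive mechanisms.
# Names, servo IDs, and the pulse-width mapping are assumptions, not from the patent.

def angle_to_pwm_us(angle_deg: float) -> int:
    """Map a servo angle in [0, 180] deg to a typical 1000-2000 us pulse width."""
    angle_deg = max(0.0, min(180.0, angle_deg))
    return int(1000 + (angle_deg / 180.0) * 1000)

class SteeringGearController:
    """Receives command information and outputs a PWM pulse to the addressed servo."""
    def __init__(self) -> None:
        self.last_pulse: dict[int, int] = {}  # servo id -> pulse width (us)

    def command(self, servo_id: int, angle_deg: float) -> int:
        pulse = angle_to_pwm_us(angle_deg)
        self.last_pulse[servo_id] = pulse  # stand-in for emitting the PWM signal
        return pulse

class MasterComputer:
    """Processes sensed information and issues corresponding servo commands."""
    def __init__(self, controller: SteeringGearController) -> None:
        self.controller = controller

    def process(self, sensed: dict[str, float]) -> None:
        # Toy decision rule: open the jaw wider as the auditory level rises,
        # as a stand-in for the mouth-shape imitation described in [0014].
        jaw_angle = 30.0 + 60.0 * min(1.0, sensed.get("auditory", 0.0))
        self.controller.command(servo_id=10, angle_deg=jaw_angle)

controller = SteeringGearController()
master = MasterComputer(controller)
master.process({"auditory": 0.5})
print(controller.last_pulse[10])  # pulse width for a 60-degree jaw angle
```

The key design point mirrored here is the strict one-way pipeline: the master computer never drives a servo directly; it only emits command information, and the controller alone translates commands into PWM pulses.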

Specific Embodiment 2

[0019] This embodiment is described with reference to Figure 2. It differs from Embodiment 1 in that the multi-sensory sensing system 4 includes two temperature sensors 33, five tactile sensors 34, two visual sensors 35, an olfactory sensor 36, and an auditory sensor 37. The two temperature sensors 33 are arranged on the left and right sides of the forehead; one tactile sensor 34 is arranged in the middle of the forehead and two tactile sensors 34 on the left and right cheeks; the two visual sensors 35 are arranged in the left and right eye sockets; the olfactory sensor 36 is arranged at the nose; and the auditory sensor 37 is arranged in the middle of the chin. The rest is the same as Embodiment 1.

[0020] Figure 2 shows the distribution of the sensors in the multi-sensory sensing system. In this embodiment, the temperature sensor...
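The sensor layout of Embodiment 2 can be captured as a simple mapping. The counts and positions come from [0019]; the dictionary form and position names are illustrative. Note that [0019] specifies five tactile sensors but this excerpt only locates three of them, so the remaining two positions are deliberately left out.

```python
# Sensor placement from Embodiment 2 ([0019]): 2 temperature, 5 tactile,
# 2 visual, 1 olfactory, 1 auditory. The data structure and position
# names are illustrative, not from the patent.
SENSOR_LAYOUT = {
    "temperature": ["forehead-left", "forehead-right"],
    # The patent places five tactile sensors in total; only three
    # positions are given in this excerpt, so only those are listed.
    "tactile": ["forehead-middle", "cheek-left", "cheek-right"],
    "visual": ["eye-socket-left", "eye-socket-right"],
    "olfactory": ["nose"],
    "auditory": ["chin-middle"],
}

def sensor_count(layout: dict[str, list[str]]) -> int:
    """Total number of sensor positions recorded in the layout."""
    return sum(len(positions) for positions in layout.values())

print(sensor_count(SENSOR_LAYOUT))  # 9 positions listed in this excerpt
```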

Specific Embodiment 3

[0022] This embodiment is described with reference to Figure 4 and Figure 5. It differs from Embodiment 1 in that the motion drive mechanism 5 includes an eyelid drive mechanism, an eyeball drive mechanism, and a jaw drive mechanism. The eyelid drive mechanism drives the eyelid 17 to move up and down; the eyeball drive mechanism drives the eyeball 24 to rotate left-right and up-down; and the jaw drive mechanism drives the lower jaw to move up and down. By cooperating with the mouth-shape drive mechanism, the mouth-shape imitation function is realized. The rest is the same as Embodiment 1.

[0023] The motion driving mechanism 5 has 4 degrees of freedom, wherein the eyeball 24 has 2 degrees of freedom, the eyelid 17 has 1 degree of freedom, and the jaw 22 has 1 degree of freedom.
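The four degrees of freedom in [0023] map naturally onto servo channels. This enumeration is an illustrative sketch; the channel numbers and axis names are placeholders, not from the patent.

```python
# Degrees of freedom of the motion drive mechanism 5 ([0023]):
# eyeball 24: 2 DOF (left-right, up-down); eyelid 17: 1 DOF; jaw 22: 1 DOF.
# Channel numbers and axis names are arbitrary placeholders.
DOF_MAP = {
    ("eyeball", "pan"): 0,   # left-right rotation
    ("eyeball", "tilt"): 1,  # up-down rotation
    ("eyelid", "lift"): 2,   # up-down eyelid motion
    ("jaw", "open"): 3,      # up-down jaw motion
}

assert len(DOF_MAP) == 4  # total DOF matches [0023]
eyeball_dof = sum(1 for (organ, _axis) in DOF_MAP if organ == "eyeball")
print(eyeball_dof)  # 2 eyeball DOF
```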

[0024] As shown in Figure 4, the head organs include eyeb...



Abstract

A human head portrait robot with facial expression and multi-sensory functions belongs to the field of robot applications and is intended to solve the problems that existing robots have a single expression and lack multi-sensory functions. The robot body comprises a motion drive mechanism, an expression drive mechanism, a face shell, and elastic facial skin. A motion control system comprises a steering gear controller and a plurality of steering gears. The information sensed by a multi-sensory sensing system is output to a master controller for processing; the master controller outputs corresponding commands to the steering gear controller, which outputs PWM pulses to drive the corresponding steering gears to rotate to appointed positions. The steering gears drive the motion drive mechanism to move the skin at the lips, so that the robot imitates human mouth shapes, and drive the expression drive mechanism to move the elastic facial skin, so that the robot realizes various facial expressions.

Description

Technical Field

[0001] The invention relates to a humanoid head portrait robot with facial expression and multi-perception functions, belonging to the field of robot applications.

Background Technique

[0002] Research on humanoid robots began in the 1960s, and after more than 50 years of development it has become one of the main research directions in the field of robotics. It integrates many disciplines, such as machinery, electronics, computers, materials, sensors, and control technology, and represents a country's level of high-tech development. "Humanoid" means that the robot has human-like perception, decision-making, behavior, and interaction capabilities. Humanoid robots not only have a human-like appearance; more importantly, they have a human-like sensory system, a human-like intelligent way of thinking, a control system, and decision-making capabilities, and ultimately perform "human-like behavior". In human-computer interaction, this gives robots "affinity" ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/00
Inventors: 吴伟国, 孟庆梅, 鹿麟
Owner: HARBIN INST OF TECH