Humanoid-head robot device with human-computer interaction function and behavior control method thereof

A technology of human-computer interaction and robotics, applied to toys, instruments, automatic toys, etc. It addresses the problems of limited perception functions and the lack of an artificial emotion model and a human-computer interaction function, while achieving a compact structure and avoiding variable definition conflicts.

Active Publication Date: 2010-01-06
HARBIN INST OF TECH

AI Technical Summary

Problems solved by technology

[0004] In view of the above state of the art, the purpose of the present invention is to provide a humanoid-head robot device with a human-computer interaction function and a behavior control method thereof, so as to solve the problems that existing humanoid-head robots cannot fully reproduce human facial expressions, have limited perception functions, and lack an artificial emotion model and a human-computer interaction function.


Examples


Specific Embodiment 1

[0014] Specific Embodiment 1: As shown in Figures 1a, 1b, 2a, 2b, 3, 4a, 4b and 6, the humanoid-head robot device with human-computer interaction function described in this embodiment is composed of a humanoid-head robot body, a robot behavior control system, and a sensor perception system. The humanoid-head robot body includes an eye movement unit 1, an upper and lower jaw movement unit 61, an artificial lung device 71, a facial expression and mouth-shape drive mechanism 81, a front support plate 7, a rear support plate 6, a stand 51, a face shell 17, and facial elastic skin 18. The eye movement unit 1 is composed of two eyeballs 12, an eyeball transmission mechanism, two eyeball servo motors 14, two eyelids 13, an eyelid transmission mechanism, two eyelid servo motors 16, and a servo motor 29. The upper and lower jaw movement unit 61 consists of the upper jaw 8, the lower jaw 9, the motor 27 and the rotating shaft 2...
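To make the composition of the robot body easier to follow, the sketch below groups the drive units and actuators named in this embodiment into a simple data structure. The unit and part numbers follow the reference numerals quoted in the text; the `DriveUnit` class, its field names, and the printed summary are illustrative assumptions, not part of the patent.

```python
from dataclasses import dataclass, field

# Illustrative grouping of the actuators named in Specific Embodiment 1.
# Reference numerals (eye movement unit 1, servo motor 14, ...) follow the text;
# the class itself is an assumption made for readability.

@dataclass
class DriveUnit:
    name: str
    reference_no: int                      # reference numeral used in the figures
    actuators: list = field(default_factory=list)

humanoid_head_body = [
    DriveUnit("eye movement unit", 1, actuators=[
        "eyeball servo motor 14 (x2)",     # drive the two eyeballs 12
        "eyelid servo motor 16 (x2)",      # drive the two eyelids 13
        "servo motor 29",
    ]),
    DriveUnit("upper and lower jaw movement unit", 61, actuators=[
        "motor 27",                        # moves the lower jaw 9 about the rotating shaft
    ]),
    DriveUnit("artificial lung device", 71),
    DriveUnit("facial expression and mouth-shape drive mechanism", 81),
]

for unit in humanoid_head_body:
    print(f"{unit.name} ({unit.reference_no}): {len(unit.actuators)} listed actuator(s)")
```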

Specific Embodiment 2

[0022] Specific Embodiment 2: The sensor perception system of this embodiment further includes a tactile sensor, which is arranged in the middle of the forehead. The other components and connection relationships are the same as in Embodiment 1.

Specific Embodiment 3

[0023] Specific Embodiment 3: The sensor perception system of this embodiment further includes two temperature sensors, which are arranged on the left and right sides of the forehead, respectively. The other components and connection relationships are the same as in Embodiment 1 or 2.
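As a reading aid, the sketch below records the sensor placements that Embodiments 2 and 3 add on top of the Embodiment 1 configuration. The dictionary layout, key names, and the accumulation helper are purely illustrative assumptions, not part of the patent.

```python
# Hypothetical sensor-layout table for the sensor perception system.
# Placements follow Embodiments 2 and 3; names and structure are illustrative.
sensor_layout = {
    "embodiment_2": [
        {"type": "tactile", "location": "middle of forehead"},
    ],
    "embodiment_3": [
        {"type": "temperature", "location": "left side of forehead"},
        {"type": "temperature", "location": "right side of forehead"},
    ],
}

def sensors_for(embodiment: int) -> list:
    """Collect the sensors added up to a given embodiment, since each
    embodiment keeps the components of the previous one."""
    result = []
    for i in range(2, embodiment + 1):
        result.extend(sensor_layout.get(f"embodiment_{i}", []))
    return result

print(sensors_for(3))  # tactile sensor plus the two temperature sensors
```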



Abstract

The invention relates to a humanoid-head robot and a behavior control method thereof, in particular to a humanoid-head robot device with a human-computer interaction function and a behavior control method thereof. It solves the problems that prior humanoid-head robots cannot completely reproduce human facial expressions, have limited perception functions, and lack artificial emotion models and a human-computer interaction function. The behavior control method comprises the following steps: the sensor perception system outputs perceived information to a main control computer for processing; the control system software in the robot behavior control system obtains the control quantity of the corresponding motor according to the artificial emotion models and executes a motion control instruction, so that the motion control card outputs a PWM pulse to drive the corresponding motor to an appointed position, realizing the human-computer interaction function and the various emotional reactions of the robot; and the sensor perception system perceives external emotion signals, recognizes the corresponding emotion signals, and uses the artificial emotion models to realize behavior control of the robot. The invention realizes the reproduction of human facial expressions and has human-like multi-perception functions such as smell, touch and vision.
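The control flow described in the abstract (sensor perception, artificial emotion model, per-motor control quantities, PWM output through the motion control card) can be pictured as a simple loop. The Python sketch below is a minimal illustration under assumed interfaces; names such as `perceive`, `emotion_model`, and `MotionControlCard`, and all numeric values, are hypothetical and do not come from the patent.

```python
import time

class MotionControlCard:
    """Hypothetical stand-in for the motion control card that outputs PWM pulses."""
    def output_pwm(self, motor_id: int, target_position: float) -> None:
        # In the real device this would generate a PWM pulse train that drives
        # the corresponding servo motor to the appointed position.
        print(f"motor {motor_id} -> position {target_position:.2f}")

def perceive() -> dict:
    """Placeholder for the sensor perception system (vision, touch, smell, ...)."""
    return {"touch": 0.0, "vision": "face_detected", "smell": 0.1}

def emotion_model(signals: dict) -> dict:
    """Placeholder artificial emotion model: maps perceived signals to an
    emotional state and from it to control quantities per motor."""
    # Purely illustrative mapping; the patent does not disclose these values.
    surprised = 1.0 if signals.get("vision") == "face_detected" else 0.0
    return {
        14: 0.3 * surprised,   # eyeball servo motors
        16: 0.8 * surprised,   # eyelid servo motors (eyes wide open)
        27: 0.5 * surprised,   # jaw motor (mouth opens)
    }

def behavior_control_loop(card: MotionControlCard, cycles: int = 3) -> None:
    for _ in range(cycles):
        signals = perceive()                          # sensor perception system
        control_quantities = emotion_model(signals)   # artificial emotion model
        for motor_id, target in control_quantities.items():
            card.output_pwm(motor_id, target)         # motion control instruction
        time.sleep(0.02)                              # assumed control period

behavior_control_loop(MotionControlCard())
```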

Description

Technical Field

[0001] The invention relates to a humanoid-head robot device and a behavior control method thereof, belonging to the field of robot applications.

Background Art

[0002] Research on humanoid robots began in the 1960s and, after more than 40 years of development, has become one of the main research directions in the field of robotics. It integrates multiple disciplines such as mechanics, electronics, computer science, materials, sensors, and control technology, and represents a country's level of high-tech development. "Humanoid" here means that the robot has human-like perception, decision-making, behavior, and interaction capabilities. The humanoid-head robot is an important direction for realizing human-machine emotional interaction within humanoid robot research. Emotion can improve the convenience and credibility of the robot, and at the same time provide the user with feedback information such as its internal state, goals and ...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): A63H13/00A63H3/36G06N3/00G06K9/00
Inventor 吴伟国孟庆梅
Owner HARBIN INST OF TECH