
Humanoid-head robot device with human-computer interaction function and behavior control method thereof

A technology of human-computer interaction and robotics, applied to toys, instruments, automatic toys and the like. It addresses the problems of limited perception functions and the lack of an artificial emotion model and of a human-computer interaction function in existing humanoid-head robots, and achieves the effects of a compact structure and the avoidance of variable-definition conflicts.

Active Publication Date: 2011-03-23
HARBIN INST OF TECH


Problems solved by technology

[0004] In view of the above-mentioned state of the art, the purpose of the present invention is to provide a humanoid-head robot device with a human-computer interaction function and a behavior control method thereof, so as to solve the problems that existing humanoid-head robots cannot fully reproduce human facial expressions, have limited perception functions, and lack artificial emotion models and a human-computer interaction function.



Examples


Specific Embodiment 1

[0014] Specific Embodiment 1: As shown in Figures 1a, 1b, 2a, 2b, 3, 4a, 4b and 6, the humanoid-head robot device with human-computer interaction function described in this embodiment is composed of three parts: the humanoid-head robot body, the robot behavior control system and the sensor perception system. The humanoid-head robot body includes an eyeball movement unit 1, an upper and lower jaw movement unit 61, an artificial lung device 71, a facial expression and mouth-shape driving mechanism 81, a front support plate 7, a rear support plate 6, a stand 51, a facial shell 17 and facial elastic skin 18. The eyeball movement unit 1 consists of two eyeballs 12, an eyeball transmission mechanism, two eyeball servo motors 14, two eyelids 13, an eyelid transmission mechanism, two eyelid servo motors 16 and a servo motor 29; the upper and lower jaw movement unit 61 is composed of a motor 27 and a rotating shaft 28; the artificial lung device 71 is made of a flexible pipe 19, c...
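
To make the composition above easier to follow, the sketch below restates the same three-part architecture (robot body with its actuation units, behavior control system, sensor perception system) as a simple data structure. It is a reading aid only: the unit and motor labels mirror the reference numerals named in this embodiment, while the structure itself, the key names and anything not stated in the excerpt are assumptions.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class ActuationUnit:
    """One mechanical unit of the robot body and the motors that drive it
    (labels follow the reference numerals of Specific Embodiment 1)."""
    name: str
    motors: List[str] = field(default_factory=list)


# Units of the humanoid-head robot body; motor lists only cover what the
# excerpt names explicitly, the rest is left open.
ROBOT_BODY: Dict[str, ActuationUnit] = {
    "eyeball_movement_unit_1": ActuationUnit(
        "eyeball movement unit 1",
        ["eyeball servo motor 14 (x2)", "eyelid servo motor 16 (x2)", "servo motor 29"],
    ),
    "jaw_movement_unit_61": ActuationUnit(
        "upper and lower jaw movement unit 61", ["motor 27"]
    ),
    "artificial_lung_device_71": ActuationUnit("artificial lung device 71"),
    "expression_mouth_drive_81": ActuationUnit(
        "facial expression and mouth-shape driving mechanism 81"
    ),
}

# Three-part architecture named in the embodiment; the short descriptions of
# the control and perception parts are paraphrases of the surrounding text.
SYSTEM = {
    "humanoid_head_robot_body": ROBOT_BODY,
    "robot_behavior_control_system": "main control computer + motion control card",
    "sensor_perception_system": "vision / tactile / temperature sensors (per Embodiments 1-3)",
}

if __name__ == "__main__":
    for key, unit in ROBOT_BODY.items():
        print(key, "->", unit.motors or "(motors not listed in the excerpt)")
```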

Specific Embodiment 2

[0022] Embodiment 2: The sensor perception system in this embodiment further includes a tactile sensor, and the tactile sensor is arranged in the middle of the forehead. Other components and connections are the same as those in the first embodiment.

Specific Embodiment 3

[0023] Embodiment 3: The sensor perception system in this embodiment further includes two temperature sensors, which are arranged on the left and right sides of the forehead respectively. Other components and connections are the same as those in Embodiment 1 or Embodiment 2.



Abstract

The invention relates to a humanoid-head robot and a behavior control method thereof, in particular to a humanoid-head robot device with a human-computer interaction function and a behavior control method thereof, solving the problems that prior humanoid-head robots cannot completely reproduce human facial expressions, have limited perception functions, and lack artificial emotion models and a human-computer interaction function. The behavior control method comprises the following steps: the sensor perception system outputs the perceived information to a main control computer for processing; control system software in the robot behavior control system obtains the control quantity of the corresponding motor according to the artificial emotion models and executes a motion control instruction, so that a motion control card outputs PWM pulses to drive the corresponding motor to the appointed position, realizing the human-computer interaction function and the various emotional reactions of the robot; and the sensor perception system perceives external emotion signals, recognizes the corresponding emotion signals and uses the artificial emotion models to realize the behavior control of the robot. The invention realizes the reproduction of human facial expressions and has human-like multi-modal perception functions such as smell, touch and vision.
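
The control flow summarized in the abstract (perceive an external signal, recognize it, map it through the artificial emotion model to motor set-points, then command the motors with PWM pulses via the motion control card) can be pictured with a short sketch. The sketch below is illustrative only: the lookup tables, channel names and the linear angle-to-pulse mapping are hypothetical stand-ins for the patent's actual emotion model and motion-card interface.

```python
from dataclasses import dataclass
from typing import Dict

# Hypothetical set-points: target angle (degrees) for each servo channel.
ExpressionPose = Dict[str, float]

# Illustrative "artificial emotion model": a recognized external signal is
# mapped to an emotional state, and each state to a facial pose. The entries
# are invented examples, not values from the patent.
EMOTION_OF_SIGNAL = {
    "face_detected": "happy",
    "forehead_touched": "surprised",
    "pungent_smell": "disgusted",
}
POSE_OF_EMOTION: Dict[str, ExpressionPose] = {
    "happy":     {"eyelid_left": 10.0, "eyelid_right": 10.0, "jaw": 15.0},
    "surprised": {"eyelid_left": 35.0, "eyelid_right": 35.0, "jaw": 25.0},
    "disgusted": {"eyelid_left": 5.0,  "eyelid_right": 5.0,  "jaw": 5.0},
    "neutral":   {"eyelid_left": 20.0, "eyelid_right": 20.0, "jaw": 0.0},
}


@dataclass
class MotionCard:
    """Stand-in for the motion control card: turns a motor set-point into a
    PWM pulse width, here with a simple linear mapping for illustration."""
    min_us: int = 1000            # pulse width at 0 degrees
    max_us: int = 2000            # pulse width at full deflection
    full_deflection_deg: float = 90.0

    def pwm_pulse_us(self, angle_deg: float) -> int:
        angle = max(0.0, min(self.full_deflection_deg, angle_deg))
        span = self.max_us - self.min_us
        return int(self.min_us + span * angle / self.full_deflection_deg)


def behavior_control_step(perceived_signal: str, card: MotionCard) -> Dict[str, int]:
    """One pass of the loop described in the abstract: recognize the signal,
    look up the emotional reaction, compute motor control quantities and emit
    PWM pulse widths for each servo channel."""
    emotion = EMOTION_OF_SIGNAL.get(perceived_signal, "neutral")
    pose = POSE_OF_EMOTION[emotion]
    return {channel: card.pwm_pulse_us(angle) for channel, angle in pose.items()}


if __name__ == "__main__":
    card = MotionCard()
    for signal in ["face_detected", "forehead_touched", "unknown"]:
        print(signal, "->", behavior_control_step(signal, card))
```

The design choice to keep the emotion model as a lookup from recognized signal to pose is only a simplification for the sketch; the patent's model would determine the reaction, and the motion control card would translate each motor's control quantity into the actual PWM output.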

Description

Technical Field

[0001] The invention relates to a humanoid-head robot device and a behavior control method thereof, belonging to the field of robot applications.

Background Art

[0002] Research on humanoid robots began in the 1960s and, after more than 50 years of development, has become one of the main research directions in the field of robotics. It integrates many disciplines, such as mechanics, electronics, computing, materials, sensing and control technology, and represents a country's level of high-tech development. The meaning of "humanoid" is that the robot has human-like perception, decision-making, behavior and interaction capabilities. The humanoid-head robot is an important direction for realizing human-robot emotional interaction within humanoid robot research. Emotions can enhance the convenience and trustworthiness of robots while providing users with feedback about the robot's internal state, goals and intentions. In human-computer interacti...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): A63H 13/00; A63H 3/36; G06N 3/00; G06K 9/00
Inventor: 吴伟国, 孟庆梅
Owner: HARBIN INST OF TECH