Human-Computer Interaction System and Working Method Based on Personality and Interpersonal Recognition

A human-computer interaction technology based on interpersonal-relationship recognition, applied in the field of computer vision, addressing problems such as existing chatbots being unable to integrate the people they recognize into the chat topic.

Active Publication Date: 2020-06-30
EMOTIBOT TECH LTD

AI Technical Summary

Problems solved by technology

[0004] In view of the above deficiencies, the present invention provides a human-computer interaction system and working method based on the recognition of personality and interpersonal relationships. By analyzing the personalities and preferences of the people in a picture or video, judging the relationships between them, and choosing appropriate topics to actively communicate with the user, the system more easily stimulates the user's interest in chatting, making the human-computer interaction more intelligent, more natural, and more humane. This solves the problem that existing robots can only recognize the people in pictures and videos but cannot weave them into the chat topic.



Examples


Embodiment

[0040] This embodiment provides a human-computer interaction system based on the recognition of personality and interpersonal relationships, as shown in Figures 1 to 6, including:

[0041] Detection module: performs face and human-body detection on the image or video input by the user, obtains the real regions of the objects to be detected, namely the face image and the human-body image, and counts the number of people in the image;
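As a rough, hedged sketch of what such a detection module could look like, the snippet below uses OpenCV's bundled Haar face cascade and default HOG people detector (placeholder choices; the patent does not name specific detectors) to locate face and body regions and count people:

```python
import cv2

def detect_faces_and_bodies(image_path):
    """Toy detection module: locate face and body regions and count people.

    The patent does not name specific detectors; OpenCV's bundled Haar face
    cascade and default HOG people detector are used here as placeholders.
    """
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

    # Face detection: returns a list of (x, y, w, h) boxes.
    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    # Full-body detection with the default HOG + linear SVM people detector.
    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())
    bodies, _ = hog.detectMultiScale(gray, winStride=(8, 8))

    # Approximate the number of people in the image by the number of faces.
    return {"faces": faces, "bodies": bodies, "num_people": len(faces)}
```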

[0042] Information feature extraction module: extracts face-image features and human-body-image features from the face and body regions obtained by the detection module, and analyzes face attributes and human-body behaviors;
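A minimal sketch of this extraction step, assuming the detection sketch above has already produced face and body boxes; normalized color histograms stand in for learned CNN features, and the attribute/behavior outputs are left as placeholders (all names here are illustrative, not from the patent):

```python
import cv2
import numpy as np

def extract_person_features(img, face_box, body_box):
    """Toy feature extraction for one person.

    A real module would run trained CNNs that output face attributes
    (age, gender, expression, ...) and body behaviors (key points, hugging,
    holding hands, ...); color histograms stand in for learned image
    features here, purely for illustration.
    """
    def crop(box):
        x, y, w, h = box
        return img[y:y + h, x:x + w]

    def hist_feature(region, bins=8):
        # Joint color histogram over the three channels, normalized to sum 1.
        hist = cv2.calcHist([region], [0, 1, 2], None,
                            [bins] * 3, [0, 256] * 3).flatten()
        return hist / (hist.sum() + 1e-8)

    face_feat = hist_feature(crop(face_box))
    body_feat = hist_feature(crop(body_box))

    # Placeholder attribute/behavior outputs a trained model would fill in.
    face_attrs = {"age": None, "gender": None, "expression": None}
    body_behaviors = {"keypoints": None, "action": None}
    return face_feat, body_feat, face_attrs, body_behaviors
```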

[0043] Information feature integration module: merges the face and body information of each person output by the information feature extraction module into a fixed-dimensional feature vector;
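The integration step could be as simple as concatenating the per-person pieces and padding or truncating to a fixed length; the target dimension of 512 below is an assumption made only for this illustration:

```python
import numpy as np

def integrate_features(face_feat, body_feat, face_attr_vec, body_behavior_vec,
                       target_dim=512):
    """Toy integration module: merge one person's face and body information
    into a single fixed-dimensional vector (target_dim is an assumed size)."""
    merged = np.concatenate([face_feat, body_feat,
                             face_attr_vec, body_behavior_vec]).astype(np.float32)
    # Pad with zeros or truncate so every person gets the same dimension.
    if merged.size < target_dim:
        merged = np.pad(merged, (0, target_dim - merged.size))
    return merged[:target_dim]
```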

[0044] Personality preference discrimination module: takes the feature vector of each person produced by the information feature integration module as input and analyzes that person's personality and preferences;
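A toy version of the personality preference discrimination might be a small classifier over the four personality types described in Example 1 below; the linear-softmax form and random weights here are purely illustrative stand-ins for a trained model:

```python
import numpy as np

PERSONALITY_TYPES = ["strong", "lively", "peaceful", "perfect"]

class PersonalityClassifier:
    """Toy personality preference discriminator: a single linear layer with
    softmax over four personality types. Weights would come from training on
    labeled data; random values are used here only to keep the sketch runnable."""

    def __init__(self, feature_dim=512, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(size=(feature_dim, len(PERSONALITY_TYPES)))
        self.b = np.zeros(len(PERSONALITY_TYPES))

    def predict(self, feature_vector):
        logits = feature_vector @ self.W + self.b
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()
        return PERSONALITY_TYPES[int(np.argmax(probs))], probs
```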

example 1

[0067] Example 1: A boy chatting with the robot inputs a selfie. First, the system performs face and human-body detection and finds that there is only one person in the image. It then carries out face-attribute analysis, human-body key-point localization, and behavior analysis for this person, and extracts image features of the face and body. This information is integrated and input into the personality preference discrimination module to determine whether he belongs to the strong type (decisive and conceited), the lively type (enthusiastic, lively, and changeable), the peaceful type (easy-going, friendly, and reticent), or the perfect type (meticulous, sensitive, and pessimistic). Based on this personality information, the system matches the preferences and chat styles stored for each personality in the database, and the chatbot chooses appropriate topics and ways to communicate with him.
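Putting the sketches above together, a hypothetical end-to-end flow for this single-person case might look like the following (PREFERENCE_DB and its topics and styles are invented placeholders for the personality-preference database mentioned in the example):

```python
import cv2
import numpy as np

# Invented stand-in for the database of preferences and chat styles per type.
PREFERENCE_DB = {
    "strong":   {"topics": ["goals", "sports"],   "style": "direct"},
    "lively":   {"topics": ["travel", "parties"], "style": "playful"},
    "peaceful": {"topics": ["books", "nature"],   "style": "gentle"},
    "perfect":  {"topics": ["art", "planning"],   "style": "thoughtful"},
}

def chat_opening_for_selfie(image_path):
    """Single-person flow: detect -> extract -> integrate -> classify -> choose topic."""
    det = detect_faces_and_bodies(image_path)                 # detection module
    img = cv2.imread(image_path)
    face_box = det["faces"][0]
    body_box = det["bodies"][0] if len(det["bodies"]) else face_box
    face_feat, body_feat, _, _ = extract_person_features(img, face_box, body_box)
    vec = integrate_features(face_feat, body_feat,
                             np.zeros(8), np.zeros(8))        # integration module
    personality, _ = PersonalityClassifier().predict(vec)     # personality module
    prefs = PREFERENCE_DB[personality]                        # dialogue module
    return f"Opening topic: {prefs['topics'][0]} (style: {prefs['style']})"
```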

example 2

[0068] Example 2: A girl chatting with the chatbot uploads a selfie of herself holding hands with her boyfriend. The system performs face and body detection on the photo to obtain the locations of the face and body regions and the number of people. Next, it analyzes each person's face for age, gender, expression, and attractiveness, and analyzes the body key points and behaviors such as hugging, holding hands, approaching, and keeping apart. Combined with the extracted face and body image features, this information is integrated into a fixed-length feature vector and input into the relationship discrimination module, which determines that the two people in the picture are a couple. Finally, based on this relationship, the chatbot takes the initiative to greet the user, for example: "Is the boy next to you your boyfriend? He's so handsome."
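For this two-person case, a hedged sketch of the relationship discrimination and greeting selection could look like the following (the relationship label set and all greeting templates beyond the quoted "couple" line are assumptions, and the linear scorer stands in for a trained model):

```python
import numpy as np

# Assumed label set; the patent only names the "couple" outcome explicitly.
RELATIONSHIPS = ["couple", "friends", "family", "colleagues"]

# Assumed greeting templates for the man-machine dialogue module; the first
# entry is quoted from the example above, the rest are invented.
GREETINGS = {
    "couple":  "Is the boy next to you your boyfriend? He's so handsome.",
    "friends": "You two look like good friends! How did you meet?",
}

def discriminate_relationship(person_vec_a, person_vec_b, seed=0):
    """Toy relationship discriminator: concatenate the two per-person feature
    vectors and score them with an (untrained) linear layer."""
    pair_vec = np.concatenate([person_vec_a, person_vec_b])
    W = np.random.default_rng(seed).normal(size=(pair_vec.size, len(RELATIONSHIPS)))
    scores = pair_vec @ W
    return RELATIONSHIPS[int(np.argmax(scores))]

def greet(relationship):
    return GREETINGS.get(relationship, "Hi! Tell me about the people in your photo.")
```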


Abstract

The invention belongs to the fields of computer vision and human-computer interaction, and provides a human-computer interaction system and working method based on the recognition of personality and interpersonal relationships, including: a detection module, which detects pictures or videos to obtain the real regions of face and human-body images; an information feature extraction module, which extracts face and body image features and analyzes face attributes and human behavior; an information feature integration module, which combines the face and body information output by the previous module to form a feature vector for each person; a personality preference discrimination module, which analyzes personality and preferences according to the input feature vector; a relationship discrimination module, which judges the relationships between the people in the picture or video; and a man-machine dialogue module, which selects topics to communicate with the user according to personality preferences and relationships. The present invention analyzes the personalities and preferences of the people in a picture or video and the relationships between them, and selects appropriate topics to actively communicate with the user, making the human-computer interaction more intelligent, more friendly and natural, and more humanized.

Description

Technical field

[0001] The invention belongs to the fields of computer vision and human-computer interaction, and in particular relates to a human-computer interaction system and working method based on the recognition of personality and interpersonal relationships.

Background technique

[0002] With the development of science and technology, the development of almost all modern technologies involves artificial intelligence, which has been widely applied in many fields. Artificial intelligence is a comprehensive subject developed from the interpenetration of computer science, cybernetics, information theory, neurophysiology, psychology, linguistics, and other disciplines. From the perspective of computer application systems, it is the science of building intelligent machines or intelligent systems that simulate human intellectual activities in order to extend human intelligence.

[0003] At present, chatbots...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06K9/00; G06K9/62; G06K9/46
CPC: G06V40/161; G06V40/20; G06V40/10; G06V10/507; G06V10/462; G06F18/22; G06F18/2414; G06F18/2415
Inventors: 简仁贤, 潘一汉, 刁玉贤, 张惠棠, 杨闵淳
Owner: EMOTIBOT TECH LTD