
Human-computer interaction method and device based on facial features

A human-computer interaction technology based on facial features, applied in the field of human-computer interaction. It addresses problems such as fluctuations in the learner's writing mentality, disturbance of psychological state, and the susceptibility of speech recognition to ambiguity, so as to maintain continuity and avoid such damage.

Pending Publication Date: 2021-11-26
北京聚匠艺传媒有限公司

AI Technical Summary

Problems solved by technology

[0004] The manual click control method requires the learner to put down the pen and change the writing posture, causing fluctuations in the writing mentality. Controlling playback and pause by voice requires the learner to switch from a static to a dynamic state, which affects the psychological state; moreover, speech recognition is prone to ambiguity and is not suitable for multiple people learning at the same time.



Examples


Detailed Description of the Embodiments

[0027] In describing the present invention, it should be understood that the orientations or positional relationships indicated by the terms "center", "longitudinal", "transverse", "length", "width", "thickness", "upper", "lower", "front", "back", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", etc. are based on the orientations or positional relationships shown in the drawings, and are used only for convenience in describing the present invention and simplifying the description, rather than indicating or implying that the device or element referred to must have a specific orientation or must be constructed and operated in a specific orientation; they should therefore not be construed as limiting the present invention.

[0028] In addition, the terms "first" and "second" are used for descriptive purposes only, and cannot be interpreted as indicating or implying relative importance or implicitly specifying the quantity of indic...



Abstract

The invention provides a human-computer interaction method and device based on facial features. The method comprises the steps of: continuously obtaining a plurality of user images while a preset video is played, and extracting a user face image from each user image; extracting facial feature information from each user face image; determining the user's pose change information according to the change in the facial feature information across the plurality of user face images and the time intervals between the collected face images, the pose change information comprising an angle change value and an angular velocity change value; and generating a corresponding control instruction according to the pose change information, and executing a corresponding control operation on the preset video based on the control instruction. With this method and device, the current motion state of the user's face is determined by detecting the biological features and motion posture of the face, and control signals such as pause or play can then be issued, so that video playback is controlled through facial motion.
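The abstract describes a per-frame control loop: capture user images during playback, extract facial features, derive an angle change value and an angular velocity change value from consecutive face images and their capture interval, and map those values to playback instructions. The following Python sketch illustrates that loop under stated assumptions: the FaceGestureController class, the threshold values, and the pitch-based play/pause mapping are illustrative choices, not details disclosed in the publication, and synthetic pose values stand in for the output of a real face-pose estimator.

from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class PoseSample:
    timestamp: float                     # capture time of the user image, in seconds
    angles: Tuple[float, float, float]   # (pitch, yaw, roll) of the face, in degrees


class FaceGestureController:
    """Derives angle-change and angular-velocity values from consecutive
    face samples and maps them to play/pause commands for a video player.
    Thresholds are assumed values, not taken from the patent text."""

    ANGLE_THRESHOLD_DEG = 20.0        # minimum angle change to count as a gesture
    VELOCITY_THRESHOLD_DEG_S = 40.0   # minimum angular velocity to count as a gesture

    def __init__(self) -> None:
        self.previous: Optional[PoseSample] = None

    def update(self, sample: PoseSample) -> Optional[str]:
        """Return 'pause', 'play', or None for the latest face sample."""
        command = None
        if self.previous is not None:
            dt = sample.timestamp - self.previous.timestamp
            if dt > 0:
                # Angle change between the two collected face images.
                delta = [abs(a - b) for a, b in zip(sample.angles, self.previous.angles)]
                # Angular velocity: angle change over the capture interval.
                velocity = [d / dt for d in delta]
                if max(delta) > self.ANGLE_THRESHOLD_DEG and max(velocity) > self.VELOCITY_THRESHOLD_DEG_S:
                    # Illustrative mapping: a quick look down (pitch increases)
                    # pauses the video, a quick look back up resumes it.
                    pitch_delta = sample.angles[0] - self.previous.angles[0]
                    command = "pause" if pitch_delta > 0 else "play"
        self.previous = sample
        return command


# Usage example with synthetic pose values in place of real face images.
if __name__ == "__main__":
    controller = FaceGestureController()
    frames = [
        PoseSample(0.00, (0.0, 0.0, 0.0)),    # looking at the screen
        PoseSample(0.25, (30.0, 0.0, 0.0)),   # quick look down at the paper -> pause
        PoseSample(0.50, (2.0, 0.0, 0.0)),    # quick look back up -> play
    ]
    for frame in frames:
        cmd = controller.update(frame)
        if cmd:
            print(f"t={frame.timestamp:.2f}s -> send '{cmd}' to the video player")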

Description

Technical Field

[0001] The present invention relates to the technical field of human-computer interaction, and in particular to a human-computer interaction method and device based on facial features.

Background Technique

[0002] In the process of calligraphy education, the body state includes the pen-holding posture, hand movements, body posture, and so on, while changes in mental state include shifts in the focus of attention, fluctuations in learning interest, interruptions of the learning rhythm, and so on. In traditional information-based calligraphy teaching, students can practice calligraphy while watching a teaching video, but while watching they may be required to operate the device that plays the video, which can change their physical or psychological state; such physical and psychological changes constitute a major obstacle in the students' learning process.

[0003] The existing methods for controlling playback ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06F3/01
CPC: G06F3/011; G06F2203/011
Inventor: 李华栋
Owner: 北京聚匠艺传媒有限公司