A Human-Computer Interaction Operating System Based on Facial Action Recognition
What is the AI technical title?
The AI technical title is generated by the PatSnap AI team. It summarizes the technical points of the patent document.
A technology relating to operating systems and facial movement, applied in the field of human-computer interaction. It addresses the problem of unnatural interaction, and offers fast processing speed, low hardware requirements, and good real-time performance.
Active Publication Date: 2022-03-29
SOUTHEAST UNIV
Cites: 7 | Cited by: 0
AI Technical Summary
This summary helps you quickly interpret the patent by identifying three key elements:
Problems solved by technology
Method used
Benefits of technology
Problems solved by technology
[0007] Traditional human-computer interaction methods for intelligent electronic products, such as keyboard, mouse, and touch input, readily cause fatigue damage to the eyes and cervical spine, and the interaction feels unnatural. To address these problems, this patent proposes non-contact interaction through facial movements. Commonly used functions can be controlled simply and conveniently, the interaction is more natural, and the user is encouraged to actively perform facial and neck movements, achieving natural interaction while reducing physical strain. At the same time, disabled users can also control a smart terminal, improving the accessibility of information sharing.
Method used
Examples
Embodiment 1
[0043] As shown in Figure 1, a human-computer interaction operating system based on facial action recognition comprises:
[0044] 1. An image acquisition module, including a camera and an image preprocessing unit;
[0045] 2. A facial action recognition module, including a position and action detection unit for the face, eyes, nose, and mouth, and a face identity authentication unit;
[0046] 3. A host module, including a central processing unit, a storage unit, data and control buses, a display, a power supply and its management unit, other peripheral units, an operating system, an interactive control unit, and application programs. The operating system is WINDOWS, LINUX, ANDROID, IOS, or another derivative operating system, and the host module is one of a desktop computer, workstation, notebook computer, mobile phone, or tablet computer.
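The patent gives no code, so the three-module architecture above can only be sketched. The class and field names below are hypothetical stand-ins for illustration:

```python
from dataclasses import dataclass

# Hypothetical names outlining the three modules of Embodiment 1.
@dataclass
class ImageAcquisitionModule:
    camera_id: int = 0                    # which camera device to open

@dataclass
class FaceActionRecognitionModule:
    authenticate_identity: bool = True    # face identity authentication on/off

@dataclass
class HostModule:
    operating_system: str = "LINUX"       # WINDOWS / LINUX / ANDROID / IOS / derivative
    device_type: str = "laptop"           # desktop, workstation, laptop, phone, tablet

@dataclass
class HCISystem:
    acquisition: ImageAcquisitionModule
    recognition: FaceActionRecognitionModule
    host: HostModule
```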
Embodiment 2
[0048] In this embodiment, the specific workflow is shown in Figure 2. The working method of the above system includes:
[0049] After the system is powered on, the host module completes the connection and initialization of the other modules, and the image acquisition module begins to capture video in real time and preprocess the images: the camera collects video frames in real time, and each captured frame is preprocessed by operations such as grayscale conversion, filtering and denoising, contrast enhancement, scaling, and mirror flipping.
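The per-frame preprocessing steps above can be sketched in plain NumPy. This is only an illustrative pipeline under common default choices (BT.601 grayscale weights, a 3x3 box filter, min-max contrast stretching); the patent does not specify the exact operations or parameters, and scaling is omitted for brevity:

```python
import numpy as np

def preprocess(frame: np.ndarray) -> np.ndarray:
    """Illustrative per-frame preprocessing: grayscale, denoise,
    contrast enhancement, and mirror flip (frame is H x W x 3 RGB uint8)."""
    # Grayscale via ITU-R BT.601 luminance weights.
    gray = frame[..., 0] * 0.299 + frame[..., 1] * 0.587 + frame[..., 2] * 0.114
    # Simple 3x3 box-filter denoising via shifted averages over a padded copy.
    padded = np.pad(gray, 1, mode="edge")
    h, w = gray.shape
    denoised = sum(padded[i:i + h, j:j + w]
                   for i in range(3) for j in range(3)) / 9.0
    # Contrast stretch to the full 0..255 range (guard against flat frames).
    lo, hi = denoised.min(), denoised.max()
    stretched = (denoised - lo) / max(hi - lo, 1e-6) * 255.0
    # Mirror flip so on-screen motion matches the user's own movement.
    return stretched[:, ::-1].astype(np.uint8)
```

In a real deployment these steps would typically be OpenCV calls; NumPy is used here only to keep the sketch self-contained.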
[0050] When the facial action recognition module detects that the video captured by the camera contains a face, it performs face identity authentication. After authentication succeeds, it tracks the region where the face is located to determine the position and movement of the face, eyes, nose, and mouth, and wakes up the interactive control unit of the host module.
[0051] According to the data obtained ...
Abstract
The invention belongs to the field of human-computer interaction, and in particular relates to a human-computer interaction operating system based on facial action recognition. First, video is captured in real time by a camera, and each frame undergoes preprocessing such as mirror flipping; a classifier detects the face region, feature regions such as the eyes and mouth are detected within the rectangular face area, facial feature points are extracted, face identity authentication is performed, and the direction and instantaneous speed of facial movement are calculated. Second, the frame difference method with preset thresholds is used to detect facial actions such as forward and backward movement of the face and motion of the eyes and mouth. Finally, according to the detected position parameters and actions of each region, the system performs the corresponding mouse movement, click, and scroll-wheel control, touch function control, and simulated keyboard shortcut combinations in three modes (system, application, and general), and can switch between different application modes through simple actions. This replaces traditional manual manipulation methods such as mouse, keyboard, and touch, realizing non-contact human-computer interaction.
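The frame difference method mentioned in the abstract can be sketched as follows. The two thresholds are illustrative presets, not values taken from the patent:

```python
import numpy as np

def motion_detected(prev: np.ndarray, curr: np.ndarray,
                    pixel_thresh: int = 25, ratio_thresh: float = 0.02) -> bool:
    """Frame-difference test on a grayscale region (e.g. an eye or mouth box).

    A pixel 'moves' if its intensity changes by more than pixel_thresh;
    the region 'moves' if the fraction of moving pixels exceeds ratio_thresh.
    Both thresholds are illustrative, not values from the patent.
    """
    # Widen to int16 so the subtraction of uint8 frames cannot wrap around.
    diff = np.abs(prev.astype(np.int16) - curr.astype(np.int16))
    return (diff > pixel_thresh).mean() > ratio_thresh
```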
Description
Technical field
[0001] The present invention relates to the field of human-computer interaction technology, and in particular to a human-computer interaction operating system based on facial action recognition.
Background technique
[0002] With the rapid development of modern computers and the Internet, the era of informatization and intelligence has become unstoppable. In addition, the emerging Internet of Things has set off a third wave of development, upgrading and transforming the structure of the world's information industry, and has become a strategic emerging industry with new economic growth points and high-quality market benefits. At the same time, various human-computer interaction (HCI) methods have emerged. Human-computer interaction is a multidisciplinary field that combines the theories and practical experience of many disciplines, such as computer science, anthropology, cognitive and behavioral psychology, and industrial design. ...
Claims
Application Information
Patent Timeline
Application Date: The date an application was filed.
Publication Date: The date a patent or application was officially published.
First Publication Date: The earliest publication date of a patent with the same application number.
Issue Date: The publication date of the patent grant document.
PCT Entry Date: The date of entry into the PCT national phase.
Estimated Expiry Date: The statutory expiry date of a patent right under the Patent Law; it is the longest term of protection the patent right can achieve, provided the right is not terminated earlier for other reasons (term extension factors are taken into account).
Invalid Date: The actual expiry date, based on the effective date or publication date of the legal transaction data of an invalidated patent.