Human-computer interaction method based on eye movement control

A human-computer interaction technology based on eye movement features, applied to the input/output of user/computer interaction, computer components, mechanical mode conversion, etc. It addresses control problems and achieves natural interaction while reducing the user's energy expenditure.

Active Publication Date: 2018-09-28
BEIJING INST OF COMP TECH & APPL

AI Technical Summary

Problems solved by technology

[0009] Although imprecise interaction methods can achieve human-machine interaction to a certain extent, they also have obvious shortcomings.
Voice interaction realizes the interaction by recognizing input speech, so it places high demands on the external environment: when the environment is too noisy or too complex, the clarity of the captured speech and the accuracy of speech recognition suffer. Gesture-based interaction, in turn, is an action language that expresses wishes and conveys commands through the positions and shapes of the arms, palms, and fingers; gesture control is demanding, the accuracy of the range of motion is difficult to control, and it increases the burden on the hands.




Embodiment Construction

[0024] In order to make the purpose, content, and advantages of the present invention clearer, the specific implementation manners of the present invention will be further described in detail below in conjunction with the accompanying drawings and embodiments.

[0025] Figure 1 shows the basic flowchart of the human-computer interaction method based on eye movement control in the present invention.

[0026] As shown in Figure 1, the human-computer interaction method based on eye movement control in the present invention is mainly divided into human eye pupil positioning and eye movement feature extraction, specifically including:

[0027] Pupil center positioning is performed based on grayscale information, and includes:

[0028] In order to obtain eye movement information in more detail and improve the accuracy of eye movement control, a pupil positioning method based on grayscale information is used, which is mainly divided into the following three stages (see the sketch after this list):

[0029] The first s...
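The enumeration of the three stages is cut off in this record, but the abstract names them: human eye positioning, pupil edge detection, and pupil center positioning. Below is a minimal sketch of such a grayscale pipeline in Python with OpenCV; the Haar-cascade detector, the fixed threshold of 40, and all function names are illustrative assumptions, not the patent's actual implementation.

```python
import cv2
import numpy as np

def locate_pupil_center(frame_bgr):
    """Return (cx, cy) of the pupil in a single camera frame, or None."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)

    # Stage 1: human eye positioning. A Haar cascade is used here as an
    # assumed stand-in for whatever eye detector the patent employs.
    eye_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_eye.xml")
    eyes = eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(eyes) == 0:
        return None
    x, y, w, h = eyes[0]
    eye_roi = gray[y:y + h, x:x + w]

    # Stage 2: pupil edge detection. The pupil is the darkest region of the
    # eye, so threshold the grayscale ROI and extract its contours.
    _, dark = cv2.threshold(eye_roi, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(dark, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)

    # Stage 3: pupil center positioning via the contour's centroid.
    m = cv2.moments(pupil)
    if m["m00"] == 0:
        return None
    cx = x + m["m10"] / m["m00"]
    cy = y + m["m01"] / m["m00"]
    return cx, cy
```

The centroid step works because the pupil contour is roughly elliptical, so its area centroid remains a stable estimate of the pupil center even when the detected edge is noisy.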



Abstract

The invention discloses a human-computer interaction method based on eye movement control. The method comprises the steps of human eye pupil positioning and eye movement feature extraction. Pupil center positioning based on grayscale information comprises three stages: (1) human eye positioning; (2) pupil edge detection; (3) pupil center positioning. Eye movement feature extraction based on visual point movement comprises: calculating the fixation point deviation and the visual point deviation mapping; on the basis of the displacement difference between two adjacent frames, having the eyes move back and forth across calibration points at known screen coordinates, and solving the mapping function with a least-squares curve fitting algorithm; and, after the eye movement feature information is obtained, issuing the corresponding system message response based on the obtained eye movement control displacement and angle information. The method reduces the user's energy expenditure, assists the user in carrying out direct control, and realizes efficient and natural interaction.
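The calibration step in the abstract, in which the eye fixates known screen points and the mapping function is solved by least-squares curve fitting, can be sketched as follows. The second-order polynomial model, the six-point minimum, and all names are assumptions; the patent states only that a least-squares curve-fitting algorithm is used.

```python
import numpy as np

def design_matrix(pupil_xy):
    """Second-order polynomial terms of the pupil position."""
    px, py = pupil_xy[:, 0], pupil_xy[:, 1]
    return np.column_stack([np.ones_like(px), px, py,
                            px * py, px ** 2, py ** 2])

def fit_gaze_mapping(pupil_xy, screen_xy):
    """Solve for coefficients mapping pupil coords to screen coords.

    pupil_xy: (N, 2) pupil centers recorded while the user fixates the
    calibration points; screen_xy: (N, 2) known screen coordinates of
    those points. This model needs N >= 6 calibration points.
    """
    A = design_matrix(pupil_xy)
    # One least-squares solve covers both screen axes at once.
    coeffs, *_ = np.linalg.lstsq(A, screen_xy, rcond=None)
    return coeffs  # shape (6, 2)

def map_gaze(pupil_xy, coeffs):
    """Apply the fitted mapping to new pupil positions."""
    return design_matrix(np.atleast_2d(pupil_xy)) @ coeffs
```

At runtime, map_gaze would convert each new pupil center into a screen point; the displacement and angle between the points of two adjacent frames would then drive the corresponding system message response described in the abstract.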

Description

Technical Field

[0001] The invention belongs to the field of computer software, and in particular relates to a human-computer interaction method based on eye movement control for mobile terminal equipment.

Background Technique

[0002] With the development of science and technology, human-computer interaction is gradually transitioning from precise interaction to imprecise interaction.

[0003] Precise interaction refers to the user entering interactive information through precise interactive means, common examples of which in daily life are: 1) command-language input based on DOS operation instructions; 2) precise input through devices with positioning functions, such as the mouse and keyboard.

[0004] With the development of multi-channel interactive systems, people no longer focus on replacing one precise interaction technology with another, but instead try to transform and develop imprecise interaction technologies.

[0005] The imprecise intera...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01; G06K9/00
CPC: G06F3/013; G06V40/165; G06V40/19
Inventors: 蒋欣欣, 冯帆, 陈树峰
Owner: BEIJING INST OF COMP TECH & APPL