
Man-machine interaction method of mobile device and mobile device

A mobile device and a human-computer interaction method for it, in the field of human-computer interaction. The invention addresses problems of existing gaze-based input such as noticeable lag and low accuracy and flexibility, which prevent a good human-computer interaction experience. The method improves operating efficiency and convenience, makes the interaction more accurate and reliable, and provides a good human-computer interaction experience.

Publication date: 2017-12-15 (Inactive)
湖州靖源信息技术有限公司

AI Technical Summary

Problems solved by technology

This human-computer interaction method uses only the line of sight of the human eye as input. In practice, the electronic device's response to that input is often less accurate and less flexible than the eye movements themselves, and the lag is noticeable, so it cannot deliver a good human-computer interaction experience.


Examples


Embodiment 1

[0026] As shown in Figure 1, a human-computer interaction method for a mobile device includes the following steps:

[0027] a. The power is turned on, the distance sensor operates, and the distance to the human body is judged;

[0028] b. The eye tracking sensor operates, records an initial eyeball image and stores it in the memory, and the timer starts counting;

[0029] c. After the preset timing cycle is completed, the eyeball image is recorded again and stored in the memory, and the timer is cleared and continues to run;

[0030] d. The controller compares and analyzes the eyeball images in the memory to judge whether the eyeball has moved;

[0031] e. The distance and direction of the eyeball movement are judged;

[0032] f. The controller instructs the display to perform the corresponding action.

[0033] A healthy browsing distance between the human body and the mobile device is preset in the distance sensor. The healthy browsing distance is 35 to 45 centimeters, and the actual distance can be adjusted a...
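The following is a minimal sketch of how the loop in steps a through f could be organized. The distance_sensor, eye_tracker, controller and display interfaces are hypothetical placeholders; the patent specifies only the sequence of operations, not any code.

```python
# Minimal sketch of the Embodiment 1 loop (steps a-f).
# All sensor/controller/display objects below are hypothetical interfaces.
import time

HEALTHY_DISTANCE_CM = (35, 45)   # preset healthy browsing distance (step a / [0033])
TIMING_CYCLE_S = 0.2             # typical value of the 0.1-1 s timing cycle

def interaction_loop(distance_sensor, eye_tracker, controller, display):
    # a. the distance sensor operates and judges the distance to the human body
    if not (HEALTHY_DISTANCE_CM[0] <= distance_sensor.read_cm() <= HEALTHY_DISTANCE_CM[1]):
        return

    # b. record the initial eyeball image and start the timer
    previous_image = eye_tracker.capture()

    while True:
        # c. wait for the preset timing cycle, then record the eyeball image again
        time.sleep(TIMING_CYCLE_S)
        current_image = eye_tracker.capture()

        # d. compare the two stored images to judge whether the eyeball has moved
        movement = controller.compare(previous_image, current_image)
        if movement is not None:
            # e. movement carries the distance and direction of the eyeball motion
            # f. the controller instructs the display to perform the corresponding action
            display.perform(controller.map_to_action(movement))

        # the timer is cleared and the cycle repeats with the latest image
        previous_image = current_image
```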

Embodiment 2

[0041] As shown in Figure 2, a mobile device includes a power supply, a distance sensor, an eye tracking sensor, a controller, a timer, a memory and a display.

[0042] When the mobile device is operating normally, the distance sensor judges the browsing distance between the human body and the mobile device. When it is within 35 to 45 centimeters, the eye tracking sensor is activated, records an initial eyeball image and stores it in the memory, while the timer starts counting. After a timing cycle of 0.1 to 1 second, usually set to 0.2 seconds, is completed, the eye tracking sensor records the eyeball image again and stores it in the memory, and the timer is cleared and continues to run. The controller compares and analyzes the two eyeball images stored in the memory and judges whether the eyeball has moved. If it has, the controller judges the distance and direction of the eyeball movement and compares them with the preset judgment threshold for...
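To illustrate the threshold comparison described above, the sketch below maps an eyeball displacement to a display action. The pixel threshold value and the page-turn / song-switch mapping are illustrative assumptions; the patent states only that the movement distance and direction are compared against a preset judgment threshold and that actions such as turning pages and switching songs result.

```python
# Sketch of the threshold comparison in Embodiment 2.
# JUDGMENT_THRESHOLD_PX and the action names are assumptions, not from the patent.
JUDGMENT_THRESHOLD_PX = 30  # hypothetical minimum eyeball displacement, in pixels

def decide_action(dx, dy):
    """Map an eyeball displacement (dx, dy) to a display action, or None."""
    if abs(dx) < JUDGMENT_THRESHOLD_PX and abs(dy) < JUDGMENT_THRESHOLD_PX:
        return None  # movement below the judgment threshold, ignore
    if abs(dx) >= abs(dy):
        # predominantly horizontal movement: e.g. switch songs
        return "next_track" if dx > 0 else "previous_track"
    # predominantly vertical movement: e.g. turn pages
    return "page_down" if dy > 0 else "page_up"
```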



Abstract

The invention provides a man-machine interaction method for a mobile device. The method comprises the following steps: a, a power supply is turned on, and a distance sensor operates and judges the distance to the human body; b, an eyeball tracking sensor operates, records initial images of the eyeballs and stores them in a memory, and a timer starts timing; c, after a preset time period is completed, images of the eyeballs are recorded again and stored in the memory, and the timer is reset and continues to run; d, a controller compares and analyzes the eyeball images in the memory and judges whether the eyeballs have moved; e, the movement distance and direction of the eyeballs are judged; f, the controller instructs a display to perform the corresponding action. The invention further provides the mobile device. With the man-machine interaction method and the mobile device, man-machine interaction such as turning pages and switching songs can be achieved through the user's eyeball movements, operating efficiency and convenience are improved, the interaction is more accurate and reliable, and the mobile device provides a good man-machine interaction experience.

Description

Technical field

[0001] The present invention relates to the technical field of human-computer interaction, and in particular to a human-computer interaction method of a mobile device and the mobile device.

Background technique

[0002] At present, various smart mobile devices are becoming increasingly popular, and the ways of human-computer interaction are also constantly developing. Modern computer systems realize human-computer interaction mainly through interactive devices such as microphones, keyboards, mice, stylus pens, sensor bars, and touch screens. These methods usually require manual or repeated operations, and some require prior training to use proficiently.

[0003] Among existing technologies, there are electronic devices that use the human eye as an input device to achieve a better human-computer interaction experience. But at present, this input method mostly detects that the human eye's line of sight is currently focused on a certain position o...

Claims


Application Information

IPC(8): G06F3/01
CPC: G06F3/013
Inventor: 王淑琴
Owner: 湖州靖源信息技术有限公司