
Man-machine interaction method and system based on sight judgment

A human-computer interaction and line-of-sight technology, applied in the field of human-computer interaction, which addresses problems such as eye injury, increased cost, and the burden placed on the user, and achieves simple and convenient operation, easy implementation, and low cost.

Active Publication Date: 2012-12-19
SHENZHEN INST OF ADVANCED TECH

AI Technical Summary

Problems solved by technology

These methods increase the cost to a certain extent and are not suitable for implementation on ordinary mobile terminals. Moreover, prolonged use of infrared light sources can damage human eyes, and fixing additional equipment on the head or near the eyes also places a burden on the user.



Examples


Embodiment 1

[0074] Referring to Figure 1, which is a flow chart of the human-computer interaction method based on line-of-sight judgment of the present invention, the method comprises the following steps:

[0075] Step S1: Acquire frame images.

[0076] In this step, the face image can be obtained in real time through the front camera of the mobile phone.
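
As an illustration only, the following is a minimal sketch of this frame-acquisition step in Python with OpenCV; the library, the camera index, and the exit key are assumptions and are not specified in the patent.

```python
import cv2

# Minimal sketch of step S1: acquire frame images in real time from a camera.
# The use of OpenCV and the camera index 0 are illustrative assumptions; the
# patent only requires that face images be captured through the front camera.
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()          # one frame image per iteration
    if not ok:
        break
    # Later steps (eye region detection, pupil localization, ...) would
    # process `frame` here.
    cv2.imshow("frame", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```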

[0077] Step S2: Human eye region detection.

[0078] Considering that when a mobile phone is in use, the distance between the eyes and the camera is generally kept between 10 and 30 centimeters, and that within this range the face occupies essentially the entire image, this method does not require a separate face detection step and performs eye region detection directly. The initial localization of the eye region does not need to be very accurate, so many methods can be used, such as the histogram projection method, the Haar detection method, the frame difference method, the template matching method, and other methods ...
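
By way of example, a minimal sketch of coarse eye region detection using one of the listed methods (Haar detection) follows; the choice of OpenCV's bundled haarcascade_eye.xml model and the detector parameters are assumptions, not details from the patent.

```python
import cv2

# Sketch of step S2: coarse eye region detection with a Haar cascade, one of
# the methods listed above. Because the face is assumed to fill the frame at
# a 10-30 cm viewing distance, the cascade is run on the whole image without
# a prior face detection step.
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def detect_eye_regions(frame):
    """Return coarse (x, y, w, h) eye rectangles; rough localization suffices."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.equalizeHist(gray)   # improve contrast before detection
    return eye_cascade.detectMultiScale(
        gray, scaleFactor=1.1, minNeighbors=5, minSize=(30, 30))
```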

Embodiment 2

[0170] This embodiment provides a human-computer interaction method based on line-of-sight judgment. Steps S1 to S6 of this method are the same as those in Embodiment 1 and are not repeated here. This method differs from Embodiment 1 in the specific implementation of the control command sending in step S6: Embodiment 1 adopts a blink control method, whereas this embodiment adopts a closed-eye control method, which detects whether a single eye is closed and how long it remains closed; when a single-eye closure is detected, the corresponding control command is sent to the electronic device according to a preset correspondence between eye-closure duration and control commands. The detailed implementation steps of sending the control command in this method are described below.
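
As a rough illustration of this closed-eye control idea, the sketch below maps a measured single-eye closure duration to a control command; the thresholds and command names are hypothetical and not taken from the patent.

```python
# Illustrative sketch of the closed-eye control method: a single-eye closure
# is detected and timed, and its duration is compared against preset
# intervals to choose a control command. The thresholds and command names
# below are hypothetical, not taken from the patent.
CLOSURE_COMMANDS = [
    (0.5, None),       # shorter than 0.5 s: treat as an ordinary blink, no command
    (1.5, "select"),   # 0.5 - 1.5 s: select the item under the viewpoint
    (3.0, "open"),     # 1.5 - 3.0 s: open / confirm
]

def command_for_closure(duration_s: float):
    """Map the measured single-eye closure duration (seconds) to a command."""
    for upper_bound, command in CLOSURE_COMMANDS:
        if duration_s < upper_bound:
            return command
    return "back"      # very long closure: hypothetical fallback command

print(command_for_closure(2.0))   # -> "open"
```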

[0171] Referring to Figure 11, which is another detailed flow chart of the control command sending of the present invention, the detailed steps of the control command sending method incl...

Embodiment 3

[0182] This embodiment provides a human-computer interaction system based on line-of-sight judgment. Referring to Figure 12, which is a schematic diagram of a user operating the human-computer interaction system based on line-of-sight judgment according to Embodiment 3 of the present invention, the system provided by Embodiment 3 is used to realize non-contact operation of a mobile phone by the user. The system includes a mobile phone 2 and a camera 21; the mobile phone 2 has a screen 22, and the camera 21 is the front camera of the mobile phone 2. The method of Embodiment 1 or Embodiment 2 is used for human-computer interaction.

[0183] The direction indicated by the arrow in the figure is the line of sight of the human eye. The front camera 21 of the mobile phone 2 captures head images of the user 1 in real time, locates the coordinates of the pupil centers of both eyes, and determines the corresponding relationship between the coordinates of the two pupil centers and the x-y...
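
A minimal sketch of one way such a pupil-to-screen correspondence could be established is given below; the affine calibration model, the four calibration points, and the pixel values are assumptions, since the patent only states that a correspondence between the pupil-center coordinates and the screen coordinate system is determined.

```python
import numpy as np

# Sketch of the coordinate correspondence described above: after a short
# calibration in which the user looks at known screen points, fit an affine
# map from pupil-centre image coordinates to screen x-y coordinates. The
# affine model and the number of calibration points are assumptions.

def fit_pupil_to_screen(pupil_pts, screen_pts):
    """pupil_pts, screen_pts: (N, 2) arrays of corresponding points, N >= 3."""
    pupil_pts = np.asarray(pupil_pts, dtype=float)
    screen_pts = np.asarray(screen_pts, dtype=float)
    A = np.hstack([pupil_pts, np.ones((len(pupil_pts), 1))])  # rows [x, y, 1]
    # Least-squares affine coefficients, one column per screen axis.
    coeffs, *_ = np.linalg.lstsq(A, screen_pts, rcond=None)
    return coeffs  # shape (3, 2)

def viewpoint_on_screen(pupil_xy, coeffs):
    """Map a tracked pupil-centre coordinate to a screen viewpoint."""
    x, y = pupil_xy
    return np.array([x, y, 1.0]) @ coeffs

# Hypothetical calibration: user looks at the four screen corners.
coeffs = fit_pupil_to_screen(
    [(210, 180), (430, 182), (212, 300), (428, 302)],   # pupil centres (px)
    [(0, 0), (480, 0), (0, 800), (480, 800)])            # screen corners (px)
print(viewpoint_on_screen((320, 240), coeffs))            # approx. screen centre
```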



Abstract

The invention relates to the technical field of man-machine interaction and provides a man-machine interaction method based on sight judgment, to realize operation of an electronic device by a user. The method comprises the following steps: obtaining a facial image through a camera, carrying out human eye area detection on the image, and positioning the pupil center according to the detected human eye area; calculating a corresponding relationship between the image coordinate system and the electronic device screen coordinate system; tracking the position of the pupil center, and calculating the viewpoint coordinate of the human eye on the electronic device screen according to the corresponding relationship; and detecting an eye blinking action or an eye closure action, and issuing corresponding control commands to the electronic device according to the detected action. The invention further provides a man-machine interaction system based on sight judgment. With the adoption of the man-machine interaction method, stable sight focus judgment on the electronic device is realized through the camera, and control commands are issued through eye blinking or eye closure, so that operation of the electronic device by the user becomes simple and convenient.

Description

Technical field

[0001] The present invention relates to the technical field of human-computer interaction, and in particular to a method and system for human-computer interaction based on line-of-sight judgment.

Background technique

[0002] With the popularization of mobile terminal devices such as mobile phones and tablet computers, human-computer interaction methods are becoming more and more abundant. At present, two kinds of human-computer interaction methods are commonly used: one is the button type, in which commands are sent through buttons; the other is the touch type, in which the touch screen adopts a capacitive or resistive screen and the user sends commands by touching the screen with the fingers. Both are hand-based human-computer interaction methods that require the intervention of the hands; when both hands are occupied, the interaction between the person and the device cannot be completed, so they cannot be applied to some special ...

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06F3/01
Inventor 宋展武照敏聂磊
Owner SHENZHEN INST OF ADVANCED TECH