Wink action-based man-machine interaction method and system

A human-computer interaction and action technology, applied in the field of human-computer interaction, can solve the problems of human eye injury, human burden, increased cost, etc., and achieve the effect of simple and convenient operation, low cost, and easy implementation.

Active Publication Date: 2012-11-28
SHENZHEN INST OF ADVANCED TECH

AI Technical Summary

Problems solved by technology

These methods increase cost to some extent and are not suitable for implementation on ordinary mobile terminals. Moreover, using infra...



Examples


Embodiment 1

[0072] Referring to Figure 1, which is a flow chart of the blinking-based human-computer interaction method of the present invention. The method comprises the following steps:

[0073] Step S1: Acquire frame images.

[0074] In this step, a face image (of resolution width × height) can be acquired in real time through the front camera of a mobile phone.

[0075] Step S2: Human eye region detection.

[0076] Considering that when using a mobile phone the distance between the eyes and the camera is generally kept between 10 and 30 centimeters, and that within this range the face occupies the entire image, this method does not need a face detection step and directly performs eye region detection. The initial localization of the eye region need not be very accurate, so many methods can be used, such as the histogram projection method, Haar detection, the frame-difference method, or template matching ...
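As a minimal sketch of one of the methods named above, the horizontal histogram projection can locate the eye band by counting dark pixels per row: in a close-up face image, the rows containing pupils, lashes, and brows contain the most dark pixels. The threshold and the toy image below are illustrative assumptions, not parameters from the patent.

```python
# Hedged sketch: coarse eye-band localization by horizontal histogram
# projection. `dark_thresh` is an assumed illustrative value.

def eye_row_band(gray, dark_thresh=80):
    """Return the row index with the most dark pixels.

    In a close-up face image the eye region (pupils, lashes, brows)
    is dark, so its rows dominate the dark-pixel projection.
    """
    projection = [sum(1 for px in row if px < dark_thresh) for row in gray]
    return max(range(len(projection)), key=projection.__getitem__)

# Tiny synthetic 6x8 "face": bright skin (200) with a dark eye stripe on row 2.
face = [[200] * 8 for _ in range(6)]
face[2] = [200, 40, 40, 200, 200, 40, 40, 200]  # two dark "eyes"

print(eye_row_band(face))  # row 2 holds all the dark pixels
```

A real implementation would run this on a grayscale camera frame and take a band of rows around the peak rather than a single row; the single-row version keeps the sketch short.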

Embodiment 2

[0131] Embodiment 2 differs from Embodiment 1 only in the specific implementation of step S4. Only the differing parts are described below; the similar parts are not repeated.

[0132] Referring to Figure 6, which is a flow chart of eye tracking in the blinking-based human-computer interaction method of Embodiment 2. After the rectangular search box is defined by initializing the search window, the box may exceed the bounds of the image in the next frame; the excess part must then be trimmed so that the search range does not exceed the image size. In this embodiment, after the next frame is acquired in step S402, step S403 is executed: determine whether the rectangular search box exceeds the bounds of the next frame of the face image. When it does, perform step S404: trim the part ...
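The trimming in steps S403/S404 can be sketched as a simple clamp of the search box to the image bounds. The (x, y, w, h) box representation and the function name are illustrative assumptions; the patent does not specify this exact form.

```python
# Hedged sketch of steps S403/S404: clamping the rectangular search box
# so the next-frame search never reads outside the image.

def clamp_box(x, y, w, h, img_w, img_h):
    """Trim the (x, y, w, h) search box to lie inside an img_w x img_h image."""
    x0 = max(0, x)                 # left edge pushed back inside
    y0 = max(0, y)                 # top edge pushed back inside
    x1 = min(img_w, x + w)         # right edge capped at image width
    y1 = min(img_h, y + h)         # bottom edge capped at image height
    return x0, y0, max(0, x1 - x0), max(0, y1 - y0)

# A box hanging off the left and bottom of a 100x40 image gets trimmed:
print(clamp_box(-10, 5, 50, 50, 100, 40))  # → (0, 5, 40, 35)
```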

Embodiment 3

[0134] Embodiment 3 applies the blinking-based human-computer interaction method of the present invention to reading e-books. The specific manner of sending the control command is described below; for the other steps, the implementations of Embodiment 1 or Embodiment 2 can be adopted.

[0135] Referring to Figure 7, which is a flow chart of sending control commands in the method of Embodiment 3. The control command is sent through the following steps:

[0136] Step S501: Detect blinking action.

[0137] When the number of detected pupil centers changes, it can be judged that a blink has occurred.

[0138] Step S502: Determine whether a single eye blinks.

[0139] When the number of detected pupil centers is 1, a single eye is blinking, which is taken as preparation to issue a control command. If only one eye blinks, execute step S503; otherwise, end this round, ...
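Steps S501-S503 can be sketched as a small decision function: a drop from two pupil centers to one signals a single-eye wink and triggers a command, while a drop to zero (both eyes closed, i.e. an ordinary blink) is ignored. The command name is invented for the e-book illustration and is not from the patent.

```python
# Hedged sketch of steps S501-S503: mapping the change in the number of
# detected pupil centers to a control command. "PAGE_FORWARD" is an
# assumed illustrative command for the e-book use case.

def wink_command(prev_pupils, curr_pupils):
    """Return a control command when exactly one eye closes.

    2 -> 1 pupil centers: single-eye wink (S502 true), issue command (S503).
    2 -> 0 pupil centers: both eyes blinked; ignored, since ordinary
    blinking should not trigger commands.
    """
    if prev_pupils == 2 and curr_pupils == 1:
        return "PAGE_FORWARD"
    return None  # not a single-eye wink: end this round

print(wink_command(2, 1))  # → PAGE_FORWARD
print(wink_command(2, 0))  # → None (two-eye blink ignored)
```

A fuller implementation would also distinguish left-eye from right-eye winks (e.g. page back vs. page forward) by checking which pupil center disappeared, but that requires positional information beyond the count used here.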



Abstract

The invention relates to the technical field of man-machine interaction and provides a wink action-based man-machine interaction method and system that enable a user to operate electronic equipment. The method comprises the following steps: acquiring a human face image through a camera; detecting human eye regions in the image and locating the pupil centers according to the detected eye regions; tracking the positions of the pupil centers; and detecting wink actions and sending a corresponding control command to the electronic equipment according to the detected wink actions. The invention further provides a wink action-based man-machine interaction system. Using the camera and the eye detection technology provided in the invention, wink actions are recognized and control commands are sent through winks, so that the user can operate the electronic equipment simply and conveniently.

Description

Technical field

[0001] The present invention relates to the technical field of human-computer interaction, and in particular to a blinking-based human-computer interaction method and system.

Background technique

[0002] With the popularization of mobile terminal devices such as mobile phones and tablet computers, and especially the development of smart phones, the role of these smart mobile terminals is no longer simply making and receiving calls and sending and receiving text messages; applications for mobile phones and tablets now flood the network. The most important applications include answering calls, sending and receiving text messages and emails, taking photos, browsing the web, playing games, and reading e-books. The human-computer interaction between users and devices is also becoming richer. At present, two kinds of human-computer interaction are commonly used: one is the button type, which sends commands thr...

Claims


Application Information

IPC(8): G06F3/01; G06K9/00
Inventor: Song Zhan (宋展), Wu Zhaomin (武照敏), Nie Lei (聂磊)
Owner SHENZHEN INST OF ADVANCED TECH