Control instruction generation method based on face recognition and electronic equipment

A control-instruction generation technology applied in the field of image processing. It addresses the problems that existing operation modes (keyboard, mouse, finger taps and slides) are limited and cannot meet users' demand for more diverse and engaging interaction, and that existing somatosensory approaches are ill-suited to mobile terminals.

Inactive Publication Date: 2017-06-30
PALMWIN INFORMATION TECH SHANGHAI

AI Technical Summary

Problems solved by technology

[0004] In existing somatosensory game technology, game operation often requires a large activity space for whole-body movement. At the same time, to achieve a better game effect, this type of game is suited to devices equipped with larger screens and therefore cannot be applied on mobile terminals. In addition, the traditional operation modes of keyboard and mouse control and of finger tapping and sliding are relatively limited, and cannot meet users' need for more diverse and engaging operation modes.

Method used



Examples


Embodiment 1

[0106] An embodiment of the present invention provides a method for generating control instructions based on face recognition. Referring to Figure 1, the method flow includes:

[0107] 101. Acquire a current video frame, and detect a human face in the current video frame.

[0108] 102. Acquire multiple two-dimensional feature point coordinates of the human face according to the detected human face.

[0109] 103. Acquire corresponding control instruction parameters according to the multiple two-dimensional feature point coordinates of the human face.

[0110] Wherein, the control instruction parameters include the rotation angle of the human face and/or the distance between the upper lip and the lower lip of the human face.

[0111] Specifically, obtaining the rotation angle of the face includes:

[0112] Obtaining a pose estimation matrix of a face relative to a three-dimensional template face, where the three-dimensional template face is a frontal three-...
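Once a pose estimation matrix of the face relative to the template face is available, the rotation angle can be read off from its 3×3 rotation part. A minimal sketch of that step, assuming the rotation matrix has already been obtained (e.g. via a PnP solver); the Euler-angle convention and function names here are illustrative, not from the patent:

```python
import math

def euler_angles_from_rotation(R):
    """Extract pitch (about x), yaw (about y) and roll (about z), in degrees,
    from a 3x3 rotation matrix, using the common Z-Y-X decomposition."""
    pitch = math.atan2(R[2][1], R[2][2])
    yaw = math.atan2(-R[2][0], math.hypot(R[2][1], R[2][2]))
    roll = math.atan2(R[1][0], R[0][0])
    return tuple(math.degrees(a) for a in (pitch, yaw, roll))

# Example: a pure rotation of 30 degrees about the vertical (y) axis,
# i.e. the face turned 30 degrees to one side.
c, s = math.cos(math.radians(30)), math.sin(math.radians(30))
R_yaw30 = [[c, 0.0, s],
           [0.0, 1.0, 0.0],
           [-s, 0.0, c]]
print(euler_angles_from_rotation(R_yaw30))  # yaw component is 30.0
```

The yaw component is the left/right head turn used below for direction control; pitch and roll can be used analogously for nodding and tilting.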

Embodiment 2

[0135] An embodiment of the present invention provides a method for generating a control instruction based on face recognition. In this embodiment, the control instruction parameters include the rotation angle of the face and the distance between the upper lip and the lower lip of the face; the direction and speed of the target object are controlled through these two parameters. Referring to Figure 2, the method flow includes:

[0136] 201. Acquire a current video frame, and detect a human face in the current video frame.

[0137] Specifically, the current video frame may be acquired through the camera according to a preset video frame acquisition instruction, and the embodiment of the present invention does not limit the specific manner of acquiring the current real-time video frame.

[0138] The method for detecting faces can be through traditional feature-based face detection methods, statistics-based face dete...
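A minimal sketch of the mapping this embodiment describes, from the two parameters (rotation angle, lip distance) to a direction-and-speed command; the dead-zone and mouth-open thresholds are illustrative assumptions, not values from the patent:

```python
def control_from_face(yaw_deg, lip_gap, dead_zone=10.0, mouth_open=0.05):
    """Map face parameters to a (direction, speed) command.
    yaw_deg: face rotation angle in degrees (negative = left, positive = right).
    lip_gap: normalized distance between the upper and lower lip.
    dead_zone and mouth_open are assumed thresholds for illustration."""
    if yaw_deg < -dead_zone:
        direction = "left"
    elif yaw_deg > dead_zone:
        direction = "right"
    else:
        direction = "straight"  # small turns are ignored to avoid jitter
    # Open mouth -> accelerate; closed mouth -> normal speed.
    speed = "fast" if lip_gap > mouth_open else "normal"
    return direction, speed

print(control_from_face(25.0, 0.08))   # ('right', 'fast')
print(control_from_face(-3.0, 0.01))   # ('straight', 'normal')
```

Normalizing the lip distance by a face-size measure (e.g. inter-eye distance) keeps the speed command stable as the user moves toward or away from the camera.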

Embodiment 3

[0237] An embodiment of the present invention provides a method for generating a control command based on face recognition. In this embodiment, the control command parameter includes the rotation angle of the face, and the direction of the target object is controlled through the rotation angle of the face. Referring to Figure 5, the method flow includes:

[0238] 501. Acquire a current video frame, and detect a human face in the current video frame.

[0239] Specifically, this step is the same as step 201 in the second embodiment, and will not be repeated here.

[0240] 502. According to the detected human face, obtain partial feature point coordinates among multiple two-dimensional feature point coordinates of the human face.

[0241] Specifically, the manner of obtaining the coordinates of the feature points in this step is the same as that of step 202 in the second embodiment, and will not be repeated here.

[0242] Preferably, some of the acquired two-dimensional f...
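A sketch of selecting such a partial subset of feature points, assuming a 68-point landmark layout of the kind produced by common landmark detectors; the specific indices below are a conventional head-pose subset chosen for illustration, not indices specified by the patent:

```python
# Assumed indices of a six-point subset of a 68-point facial landmark
# layout: nose tip, chin, outer eye corners, and mouth corners.
POSE_SUBSET = {
    "nose_tip": 30,
    "chin": 8,
    "left_eye_outer": 36,
    "right_eye_outer": 45,
    "left_mouth": 48,
    "right_mouth": 54,
}

def select_pose_points(landmarks):
    """Pick the partial feature points used for direction-only control
    from a full list of 68 (x, y) landmark coordinates."""
    return {name: landmarks[i] for name, i in POSE_SUBSET.items()}

# Dummy 68-point landmark list for demonstration.
dummy = [(float(i), float(i)) for i in range(68)]
subset = select_pose_points(dummy)
print(sorted(subset))
```

Using only this subset reduces the per-frame cost of the pose estimation, which matters on the mobile terminals the patent targets.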



Abstract

The invention discloses a control instruction generation method based on face recognition, and electronic equipment, and belongs to the field of image processing. The method comprises the steps that a current video frame is acquired, and a face in the current video frame is detected; multiple two-dimensional feature point coordinates of the face are acquired according to the detected face; corresponding control instruction parameters are acquired according to the two-dimensional feature point coordinates of the face, wherein the control instruction parameters comprise the face rotation angle and/or the distance between the upper lip and the lower lip of the face; and the direction and/or the speed of a target object is controlled according to the control instruction parameters. Since the control instruction parameters are acquired from the two-dimensional feature point coordinates of the face on the basis of face detection, the target object can be controlled through face actions. The operation is simple and convenient, the modes of controlling the target object are enriched, the user experience is improved, and user requirements are met.

Description

technical field

[0001] The invention relates to the field of image processing, and in particular to a method for generating control instructions based on face recognition, and to electronic equipment.

Background technique

[0002] With the popularization of face recognition in computer technology, face recognition by computer has been applied in various fields. At present, the main operation mode of games on desktop and notebook computers is still the mouse and keyboard, while the main operation mode of games in mobile applications is tapping and sliding with fingers. In order to enhance the player's gaming experience and move beyond the traditional keyboard, mouse, and finger tap-and-slide operations, it is necessary to provide a manner of controlling a target object in a game through face recognition.

[0003] In existing somatosensory games, users can operate game objects through body movements, which are mainly accomplished by recognizing human body movemen...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01, G06K9/00
CPC: G06F3/011, G06F2203/011, G06V40/161, G06V40/171
Inventor: 周世威
Owner: PALMWIN INFORMATION TECH SHANGHAI