Line of sight estimation method based on generative adversarial network

A line-of-sight estimation and gaze-direction technology, applied in the field of line-of-sight estimation, addressing problems such as the difficulty of accurately judging the gaze direction

Inactive Publication Date: 2018-07-27
SHENZHEN WEITESHI TECH

Problems solved by technology

[0004] To address problems such as occlusion of the eye by eyelids and eyelashes and the difficulty of accurately judging the gaze direction, the purpose of the present invention is to provide a line-of-sight estimation method based on a generative adversarial network. First, the face image is automatically aligned with the UV (horizontal and vertical) texture space of a 3D model; then an unpaired pixel-level domain adaptation technique maps the synthetic image to the real domain; a gaze direction estimator is pre-trained using gaze direction annotations and synthetic data; finally, a refinement network is applied throughout the mapping process to preserve the gaze direction, using the pre-trained network as a constraint on the synthetic-to-real-to-synthetic transformation cycle.
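As a rough illustration of this objective, the sketch below (PyTorch-style) combines the unpaired adversarial term, the pixel-level cycle term, and the gaze-preservation constraint supplied by the pre-trained estimator. The network names G_s2r, G_r2s, D_real and gaze_net, as well as the loss weights, are hypothetical placeholders; this is a minimal sketch of the idea under those assumptions, not the patent's implementation.

```python
# Minimal sketch of the generator-side objective, assuming:
#   G_s2r : synthetic -> real generator,  G_r2s : real -> synthetic generator,
#   D_real: discriminator on the real domain,
#   gaze_net: gaze estimator pre-trained on annotated synthetic data (frozen).
# Loss weights are illustrative, not the patent's values.
import torch
import torch.nn.functional as F

def generator_loss(G_s2r, G_r2s, D_real, gaze_net,
                   synth_img, synth_gaze,
                   lambda_cyc=10.0, lambda_gaze=5.0):
    fake_real = G_s2r(synth_img)       # refine the rendered eye image
    recon_synth = G_r2s(fake_real)     # synthetic -> real -> synthetic cycle

    # Unpaired adversarial term: refined images should look real to D_real.
    logits = D_real(fake_real)
    adv = F.binary_cross_entropy_with_logits(logits, torch.ones_like(logits))

    # Pixel-level cycle consistency between the input and its reconstruction.
    cyc = F.l1_loss(recon_synth, synth_img)

    # Gaze preservation: the frozen pre-trained estimator must still recover
    # the synthetic image's ground-truth gaze (yaw, pitch) after the cycle.
    gaze = F.mse_loss(gaze_net(recon_synth), synth_gaze)

    return adv + lambda_cyc * cyc + lambda_gaze * gaze
```

The discriminator update and the real-to-synthetic direction are trained with the usual adversarial counterparts and are omitted here for brevity.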

Method used



Examples


Detailed Description of the Embodiments

[0030] It should be noted that, where no conflict arises, the embodiments of the present application and the features of those embodiments may be combined with each other. The present invention will be further described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0031] Figure 1 is a system flowchart of the line-of-sight estimation method based on a generative adversarial network according to the present invention. The method mainly comprises generating textures, generating real data, and refining the eyes.

[0032] In the line-of-sight estimation method, first, a simulator forms a 3D scene of the eye region according to the specified lighting conditions, gaze direction and skin; in order to extend the limited diversity of the original principal-component-based texture model, the face image is automatically aligned with the UV (horizontal and vertical) texture space, enabling the rendering of an image of the eye region with an...
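The pre-training step on synthetic data mentioned in the pipeline can be pictured with the following minimal sketch, assuming a small convolutional regressor and a data loader that yields synthetic eye crops paired with (yaw, pitch) annotations; the architecture and hyperparameters are placeholders, not those of the invention.

```python
# Hypothetical pre-training of a gaze direction estimator on synthetic data.
import torch
import torch.nn as nn

class GazeEstimator(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, 2)   # predict (yaw, pitch) in radians

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

def pretrain(model, synth_loader, epochs=10, lr=1e-3):
    """Supervised regression on synthetic eye crops with gaze annotations."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for eye_img, gaze in synth_loader:   # synthetic crop, (yaw, pitch)
            opt.zero_grad()
            loss = loss_fn(model(eye_img), gaze)
            loss.backward()
            opt.step()
    return model
```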



Abstract

The invention proposes a line-of-sight estimation method based on a generative adversarial network. The method mainly comprises generating a texture, generating real data, and refining the eyes. The process is as follows: first, a face image is automatically aligned with the UV (horizontal and vertical) texture space of a 3D model; then, an unpaired pixel-level domain adaptation technique maps the synthetic image to the real domain; next, a gaze direction estimator is pre-trained using gaze direction annotations and synthetic data; finally, throughout the mapping process, a refinement network is applied to preserve the gaze direction, and the pre-trained network serves as a constraint on the synthetic-to-real-to-synthetic conversion cycle. By means of this method, a novel adversarial training procedure maps rendered synthetic images to a realistic domain, and accurate gaze estimation can be obtained on real images without using any additional labeled data from real users. For situations such as extreme head pose, blur, and long distance, the method can produce robust gaze estimates.
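For illustration only, applying such a trained estimator to a real eye crop might look like the sketch below; the crop size, file path, and model interface are assumptions, not details given in the patent.

```python
# Hypothetical inference on a real eye crop with a trained gaze estimator.
import torch
from PIL import Image
from torchvision import transforms

preprocess = transforms.Compose([
    transforms.Resize((36, 60)),   # assumed eye-crop resolution
    transforms.ToTensor(),
])

def estimate_gaze(model, eye_crop_path):
    img = preprocess(Image.open(eye_crop_path).convert("RGB")).unsqueeze(0)
    model.eval()
    with torch.no_grad():
        yaw, pitch = model(img)[0].tolist()
    return yaw, pitch   # estimated gaze direction in radians
```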

Description

Technical field

[0001] The invention relates to the field of line-of-sight estimation, and in particular to a line-of-sight estimation method based on a generative adversarial network.

Background technique

[0002] Estimation of the human eye's gaze direction is an important branch of human-computer interaction technology. It mainly studies the detection and recognition of eye-movement characteristics. Besides inferring the object or area that a person is attending to, it makes it possible to further analyze and study the person's psychological activity. Eye movement can also be used to control external devices and systems. In traffic applications, it can detect a driver's fatigue state and judge whether the driver is driving while fatigued; if the driver's eyes keep staring in one direction, the driver is likely distracted, which may lead to a traffic accident. In the military and aviation fields, people can also control external equi...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/62; G06F3/01
CPC: G06F3/013; G06V40/171; G06V40/18; G06F18/214
Inventor: 夏春秋
Owner: SHENZHEN WEITESHI TECH