Image recognition method based on eye movement fixation point guidance, MR glasses and medium

An image recognition and gaze-point technology applied in the field of image recognition. It addresses problems such as poor interactive experience, failure to provide users with a better interactive experience, and the scarcity of methods for recognizing virtual holographic targets.

Pending Publication Date: 2021-03-16
Applicant: 幻蝎科技(武汉)有限公司


Problems solved by technology

However, the above-mentioned methods suffer from the following problems:
1. Privacy and security risks.
2. High power consumption.
3. Inability to discern the user's real intention toward the object being looked at.
4. Inability to accurately frame the target object of interest.
[0010] (1) The existing approach, in which the eye-tracking device of MR/AR glasses is used to identify the image of the gaze area, carries privacy and security risks and consumes a great deal of power. It cannot distinguish the user's real intention toward the object being gazed at, and it cannot accurately frame the target object of interest.
[0011] (2) In specific application services, the color front camera must remain on at all times and may capture and store images of other people, infringing on their privacy and personal portrait rights; MR glasses applications of this kind provoke public backlash over privacy and security.
[0012] (3) The three-dimensional perception function of MR/AR glasses generally needs to remain active at all times. When the object detection function is then activated, the color camera must also be turned on to obtain images, which is equivalent to running the color and infrared cameras simultaneously. This causes very high power consumption on AR/MR glasses and makes the system and applications of the glasses lag, further degrading the user experience.
[0013] (4) In the existing method of cropping a partial image around the geometric position of the gaze point when the user exhibits an "interested behavior", the camera screenshot is delayed and inaccurate, and a single target object is easily split in two. As a result, the overall user experience is poor: the program is slow, the image quality is low, and the image recognition is inaccurate.
[0014] (5) At present, there are very few methods in published patents at home and abroad for recognizing virtual holographic targets. Meanwhile, in the existing partial-image recognition methods based on AR smart glasses, the trigger conditions are somewhat rigid and too absolute, and therefore cannot provide users with a better interactive experience.
[0021] (3) Published patents at home and abroad currently provide a variety of methods for recognizing objects in the physical world through eye-tracking guidance, but very few methods for recognizing virtual holographic targets. What MR glasses will present in the future is a world in which the real and the virtual are superimposed and mixed, and holographic objects tend to attract the user's attention even more.
[0023] (4) In "A Partial Image Recognition Method Based on AR Smart Glasses" (CN 109086726), the partial image recognition program is started when human bioelectric signals are obtained and the user's point of interest is identified. The interest recognition conditions include: A. the duration of gazing at a certain area exceeds a threshold; B. the number of times the gaze returns to a certain area exceeds a preset number; C. the number of blinks, or the blinking behavior, while gazing at a certain area reaches a preset standard; D. the gazed-at things produce a regular visual attention pattern; E. a brain-wave detection module simultaneously detects that the user is interested in the visual fixation area; F. a biological-information monitoring module (heart rate, blood pressure) simultaneously detects the user's emotional change data; G. the eyes simultaneously produce physiological pupil dilation or constriction responses to the things currently being watched. However, these mechanism conditions are somewhat rigid and too absolute, and cannot provide users with a better interactive experience.
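One way to picture the critique above is to contrast the rigid all-or-nothing conditions A-G with a soft weighted interest score. The sketch below is purely illustrative; the signal names, normalization constants, and weights are assumptions, not values from either patent.

```python
# Hypothetical sketch: combining several gaze/physiology signals into one
# soft interest score instead of rigid per-condition thresholds.
# All reference values and weights below are illustrative assumptions.

def interest_score(dwell_s, revisits, blink_events, pupil_delta):
    """Combine gaze and physiology signals into one score in [0, 1]."""
    # Normalize each signal against a soft reference value, capping at 1.0.
    dwell = min(dwell_s / 2.0, 1.0)           # 2 s of dwell saturates the signal
    revisit = min(revisits / 3.0, 1.0)        # 3 return gazes saturate it
    blink = min(blink_events / 2.0, 1.0)
    pupil = min(abs(pupil_delta) / 0.5, 1.0)  # relative pupil-size change
    # Weighted sum; no single condition is mandatory.
    return 0.4 * dwell + 0.3 * revisit + 0.1 * blink + 0.2 * pupil

def should_trigger(score, threshold=0.5):
    """Start local image recognition once the combined score is high enough."""
    return score >= threshold
```

Because the score is a weighted sum, a strong showing on two or three signals can trigger recognition even when the others are absent, which is the flexibility the rigid conditions lack.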



Examples


Embodiment 1

[0276] Embodiment 1: as shown in Figure 4, an image is obtained by mixing the infrared and color cameras.

[0277] S101: The physical world is reconstructed into a three-dimensional space through the infrared camera of the MR glasses, which captures real black-and-white images in real time.

[0278] S102: The eye tracking device of the MR glasses obtains the gaze direction of the user's line of sight, or the head movement tracking device obtains the gaze point at the center of the user's field of vision; a mapping algorithm then obtains the coordinate position of the user's gaze point in one or more front-camera images and in the holographic space.
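The patent does not specify the mapping algorithm; a minimal sketch of one common approach is to project the gaze direction (a vector in the front camera's coordinate frame) onto pixel coordinates with a pinhole camera model. The intrinsics below are assumed example values.

```python
# Illustrative sketch of the S102 mapping step: project a 3D gaze direction
# onto (u, v) pixel coordinates using pinhole-camera intrinsics.
# fx, fy, cx, cy are assumed example values, not parameters from the patent.

def gaze_to_pixel(direction, fx=600.0, fy=600.0, cx=320.0, cy=240.0):
    """Map a gaze direction (x, y, z), with z > 0 pointing forward,
    to pixel coordinates in the front-camera image."""
    x, y, z = direction
    if z <= 0:
        raise ValueError("gaze direction must point in front of the camera")
    u = fx * (x / z) + cx  # horizontal pixel coordinate
    v = fy * (y / z) + cy  # vertical pixel coordinate
    return u, v
```

In practice this projection would be preceded by a rigid transform from the eye-tracker's coordinate frame into the front camera's frame, calibrated per device.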

[0279] S103: The local processor and local database of the MR glasses perform AI image analysis on the black-and-white images captured by the infrared camera in S101, identify at least one object in the image using the trained object feature library, and adaptively frame the target object in the...
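The "adaptive framing" step can be sketched as building a crop box centered on the gaze pixel and shifting it to stay inside the image, so a target near the edge is not split. The box and image dimensions below are assumed values for illustration, not parameters from the patent.

```python
# Minimal sketch of adaptively framing the target around the gaze point:
# center a crop box on the gaze pixel, then shift it back inside the image
# bounds so the crop always has the full requested size.

def adaptive_crop(u, v, box_w, box_h, img_w, img_h):
    """Return (left, top, right, bottom) of a crop box centered on (u, v),
    shifted as needed to stay inside an img_w x img_h image."""
    left = int(round(u - box_w / 2))
    top = int(round(v - box_h / 2))
    # Shift rather than truncate, so a gaze point near the border still
    # yields a full-size crop containing the target.
    left = max(0, min(left, img_w - box_w))
    top = max(0, min(top, img_h - box_h))
    return left, top, left + box_w, top + box_h
```

This avoids the failure mode criticized in paragraph [0013], where a naive screenshot cuts one target object in two.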

Embodiment 2

[0343] Embodiment 2: The IR camera and the RGB camera are mixed to obtain a real-scene image; scene analysis and behavior analysis predict the target object the user is interested in, and image recognition is performed on it.

[0344] S201: The physical world is reconstructed into a three-dimensional space through the infrared camera of the MR glasses, which captures black-and-white images in real time.

[0345] S202: The eye tracking device of the MR glasses obtains the gaze direction of the user's line of sight, or the head movement tracking device obtains the gaze point at the center of the user's field of vision; a mapping algorithm then obtains the coordinate position of the user's gaze point in one or more front-camera images and in the holographic space.

[0346] S203: Objects and sounds in the scene are detected; the local processor and local database of the MR glasses perform AI image analysis on the black-and-white images captured...
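One simple form of the behavior analysis mentioned in this embodiment is dwell-time fixation detection: if consecutive gaze samples stay within a small radius long enough, the object under the gaze point becomes a candidate of interest. The sample rate, radius, and threshold below are assumptions for illustration.

```python
# Hedged sketch of behavior analysis via dwell time: measure how long the
# gaze stays within `radius` pixels of a sample point. Sample interval,
# radius, and any trigger threshold are illustrative assumptions.
import math

def fixation_dwell(samples, radius=20.0, sample_dt=0.1):
    """Given gaze samples [(u, v), ...] taken every sample_dt seconds,
    return the longest continuous dwell time (s) near any sample."""
    best = 0.0
    for i, (u0, v0) in enumerate(samples):
        dwell = 0.0
        for u, v in samples[i + 1:]:
            if math.hypot(u - u0, v - v0) <= radius:
                dwell += sample_dt
            else:
                break  # gaze left the neighborhood; dwell run ends
        best = max(best, dwell)
    return best
```

A dwell above, say, 0.4 s could then feed the interest prediction that decides whether to run recognition on the gazed-at object.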

Embodiment 3

[0380] Embodiment 3: The IR camera and the RGB camera are mixed to obtain a real-scene image; eye-movement interaction intention is used to predict the target object the user is interested in, and image recognition is performed on it.

[0381] S301: The physical world is reconstructed into a three-dimensional space through the infrared camera of the MR glasses, which captures black-and-white images in real time.

[0382] S302: The eye tracking device of the MR glasses obtains the gaze direction of the user's line of sight, or the head movement tracking device obtains the gaze point at the center of the user's field of vision; a mapping algorithm then obtains the coordinate position of the user's gaze point in one or more front-camera images and in the holographic space.

[0383] S303: The local processor and local database of the MR glasses perform AI image analysis on the black-and-white images captured by the infrared camera in S301, iden...
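The patent does not detail how eye-movement interaction intention is inferred; a common building block is a velocity-threshold (I-VT) classifier that separates slow fixations (possible interest) from fast saccades (visual search). The threshold and units below are assumptions for illustration.

```python
# Illustrative I-VT sketch for inferring eye-movement intention: label each
# inter-sample interval as 'fixation' (slow, possible interest) or 'saccade'
# (fast, searching). The 30 deg/s threshold and 100 Hz sampling (dt=0.01 s)
# are assumed example values, not figures from the patent.

def classify_ivt(angular_positions, velocity_threshold=30.0, dt=0.01):
    """Classify intervals between 1D angular gaze positions (degrees)
    sampled every dt seconds."""
    labels = []
    for a, b in zip(angular_positions, angular_positions[1:]):
        velocity = abs(b - a) / dt  # angular velocity in deg/s
        labels.append("fixation" if velocity < velocity_threshold else "saccade")
    return labels
```

A run of "fixation" labels over one object would then signal interaction intention and select that object as the target for recognition.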



Abstract

The invention belongs to the technical field of image recognition and discloses an image recognition method based on eye movement fixation point guidance, MR glasses and a medium. The method comprises: mixing the infrared and color cameras to obtain an image; mixing the IR camera and the RGB camera to obtain a live-action image; mixing a low-resolution camera and a high-resolution camera to obtain and recognize a live-action image; mixing a physical camera and a virtual camera to obtain and recognize an image of a real/virtual target; and detecting the user's behaviors and physiological data to calculate an interest degree, then starting a camera to obtain and recognize an exterior image. In addition, in the process of acquiring the image of the real/virtual target by mixing the physical camera and the virtual camera, the camera is started after the interest degree is calculated from the user's behavior and physiological data, and the physical camera or the virtual camera is then selected for image acquisition and recognition according to the position and/or depth of the fixation point.
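The final selection step (physical versus virtual camera, by fixation-point position and/or depth) can be sketched as a simple depth comparison against the holographic objects in the scene. The tolerance value and the flat list of hologram depths are assumptions for illustration, not details from the patent.

```python
# Hypothetical sketch of selecting the capture source from gaze depth:
# if the fixation point's depth matches a holographic object's depth within
# a tolerance, recognize via the virtual camera (render-side capture);
# otherwise use the physical camera. Tolerance is an assumed value.

def select_camera(gaze_depth_m, hologram_depths_m, tolerance_m=0.15):
    """Return 'virtual' if the gaze depth lies near any holographic
    object's depth, otherwise 'physical'."""
    for depth in hologram_depths_m:
        if abs(gaze_depth_m - depth) <= tolerance_m:
            return "virtual"
    return "physical"
```

A virtual-camera capture of a hologram avoids turning on the color camera at all, which also addresses the power and privacy problems raised in the background section.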

Description

technical field

[0001] The invention belongs to the technical field of image recognition, and in particular relates to an image recognition method based on eye movement gaze point guidance, MR glasses and a medium.

Background technique

[0002] With the rapid development of the AR/VR industry in recent years, AR smart glasses may become the next generation of smart terminals that replaces the smartphone. Applications on AR smart glasses are therefore in a state comparable to smartphone apps when smartphones first emerged, and hold great market value. Eye tracking technology is expected to be an important human-computer interaction method for future AR smart glasses; against this background, there is likely to be strong potential demand for such applications. [0003] AR/MR/XR glasses carry privacy and security risks. In the current advertisement/content recommendation technology of MR/AR/VR smart gla...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/32; G06F3/01; G06T17/00; G06T19/00
CPC: G06F3/013; G06T19/006; G06T17/00; G06V20/64; G06V10/25
Inventors: 陈涛, 朱若晴
Owner: 幻蝎科技(武汉)有限公司