Tracking and positioning method and device for three-dimensional display

A tracking, positioning and stereoscopic display technology, applied in stereoscopic systems, image communications, electrical components and the like. It addresses problems such as inaccurate tracking and positioning and crosstalk, and achieves the effects of reducing crosstalk and optimizing the stereoscopic display effect.

Publication status: Inactive; publication date: 2018-10-02
SUPERD CO LTD

AI Technical Summary

Problems solved by technology

[0005] The purpose of the present invention is to provide a tracking and positioning method and device for stereoscopic display, aiming to solve the prior-art problem that tracking and positioning becomes inaccurate when the viewer tilts his or her head while watching a stereoscopic display image, which results in crosstalk.



Examples


Embodiment 1

[0024] See figure 1, which is a schematic flowchart of a tracking and positioning method for stereoscopic display according to Embodiment 1 of the present invention. The tracking and positioning method for stereoscopic display according to Embodiment 1 includes the following steps:

[0025] S1 setting at least a first auxiliary positioning point and a second auxiliary positioning point corresponding to the position of the viewer's eyes;

[0026] S2 acquiring the position information of the first auxiliary positioning point and the position information of the second auxiliary positioning point in real time;

[0027] S3 obtaining the position information of the center point of the viewer's eyes according to the position information of the first auxiliary positioning point and the position information of the second auxiliary positioning point, wherein the center point of the eyes is the point of central symmetry between the viewer's left eye and right eye, and taking the position information of this center point as the tracking and positioning information while the viewer watches the stereoscopic display with both eyes.
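The following is a minimal illustrative sketch of step S3 in Python, not part of the original disclosure. It assumes the two auxiliary positioning points are already available as 3D coordinates and sit symmetrically about the center of the viewer's two eyes, so that their midpoint coincides with the binocular center used as the tracking input; all function and variable names are illustrative only.

    from typing import Tuple

    Point3D = Tuple[float, float, float]

    def binocular_center(p1: Point3D, p2: Point3D) -> Point3D:
        """Midpoint of the two auxiliary positioning points.

        If the points sit symmetrically about the center of the viewer's
        eyes, this midpoint is the binocular center that serves as the
        tracking and positioning information (step S3).
        """
        return tuple((a + b) / 2.0 for a, b in zip(p1, p2))

    # Example: auxiliary point positions in camera coordinates (metres)
    first_point = (-0.032, 0.010, 0.600)
    second_point = (0.032, 0.012, 0.602)
    print(binocular_center(first_point, second_point))  # (0.0, 0.011, 0.601)

Because the midpoint of two symmetrically placed points is unchanged by a rotation of the head about that point, this quantity remains a stable tracking input even when the viewer tilts his or her head.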

Example 1: infrared signal emitting units

[0030] When the first auxiliary positioning point and the second auxiliary positioning point are infrared signal emitting units arranged on the head of the viewer, step S2 specifically includes:

[0031] S21 Receive in real time the first infrared signal and the second infrared signal sent simultaneously by the first auxiliary positioning point and the second auxiliary positioning point;

[0032] S22 Obtain the spatial coordinate information of the first auxiliary positioning point and the spatial coordinate information of the second auxiliary positioning point according to the first infrared signal and the second infrared signal.
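The embodiment does not spell out how the receiving side turns the two infrared signals into spatial coordinates in step S22. One common approach, shown here only as a hedged sketch, is to image each emitter with a calibrated and rectified pair of infrared receivers and triangulate from the disparity; the pinhole-camera parameters below (focal length in pixels, baseline in metres) and all names are assumptions for illustration, not values taken from the patent.

    def triangulate_rectified(u_left: float, u_right: float, v: float,
                              focal_px: float, baseline_m: float):
        """Recover one emitter's 3D position from a rectified stereo pair
        of infrared receivers (pinhole model, image coordinates already
        shifted so that the principal point is at (0, 0))."""
        disparity = u_left - u_right
        if disparity <= 0:
            raise ValueError("emitter must lie in front of both receivers")
        z = focal_px * baseline_m / disparity   # depth along the optical axis
        x = u_left * z / focal_px               # lateral position
        y = v * z / focal_px                    # vertical position
        return (x, y, z)

    # Steps S21/S22: detect each emitter in both views, then triangulate it
    first_point = triangulate_rectified(52.0, 28.0, -5.0, focal_px=800.0, baseline_m=0.06)
    second_point = triangulate_rectified(100.0, 76.0, -3.0, focal_px=800.0, baseline_m=0.06)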

[0033] If the position of the center point of the eyes is used as the input of the mapping algorithm, the coordinates of the calculated tracking position need to be converted into the coordinates of the center of the eyes (nasion point).
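Paragraph [0033] only states that the tracked coordinates must be converted into the coordinates of the binocular center (nasion point); the text available here does not give the conversion itself. The sketch below is one plausible way to perform it, under the assumption that the offset from the tracked midpoint to the nasion is a fixed, per-user vector expressed in a head-fixed frame whose x-axis joins the two auxiliary points; the assumed camera "up" direction and all names are illustrative.

    import numpy as np

    def to_nasion(p1, p2, offset_head_frame):
        """Convert the tracked midpoint of the two auxiliary points into
        the coordinates of the binocular center (nasion point).

        offset_head_frame: per-user (x, y, z) offset from the tracked
        midpoint to the nasion, measured once in a head-fixed frame whose
        x-axis runs from the first to the second auxiliary point."""
        p1 = np.asarray(p1, dtype=float)
        p2 = np.asarray(p2, dtype=float)
        mid = (p1 + p2) / 2.0
        x_axis = (p2 - p1) / np.linalg.norm(p2 - p1)   # inter-point axis
        up = np.array([0.0, 1.0, 0.0])                 # assumed camera "up"
        z_axis = np.cross(x_axis, up)
        z_axis /= np.linalg.norm(z_axis)               # roughly front-to-back
        y_axis = np.cross(z_axis, x_axis)              # completes the frame
        rotation = np.column_stack([x_axis, y_axis, z_axis])
        return mid + rotation @ np.asarray(offset_head_frame, dtype=float)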

[0034] See figure 2 and figure 3; figure 2 is a schematic diagram showing the relationship between the ...

Example 2: facial feature points

[0039] When the first auxiliary positioning point and the second auxiliary positioning point are feature points of the viewer's face, step S2 specifically includes:

[0040] S210 Recognizing the face of the viewer in real time, extracting at least two feature points of the face of the viewer, obtaining the positional relationship between each feature point and the eyes of the viewer, and arbitrarily selecting two feature points as the first auxiliary positioning point and the second auxiliary positioning point;

[0041] S220 respectively locating the feature point serving as the first auxiliary positioning point and the feature point serving as the second auxiliary positioning point, and obtaining the spatial coordinate information of the first auxiliary positioning point and the spatial coordinate information of the second auxiliary positioning point.

[0042] Specifically, the feature points extracted by general face recognition algorithms are...
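The paragraph above is cut off before it names the feature points, so the sketch below should be read only as one possible realization of steps S210 and S220, not as the patent's own choice. It assumes dlib's standard 68-point facial landmark model, in which points 36 and 45 are the two outer eye corners; their midpoint in the image is used as the 2D projection of the binocular center, and depth would still have to come from a separate range estimate or a stereo setup.

    import dlib
    import numpy as np

    # Assumes dlib and its standard 68-point landmark model file are available.
    detector = dlib.get_frontal_face_detector()
    predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

    def eye_center_pixels(gray_image):
        """Return the 2D pixel coordinates of the binocular center, derived
        from two facial feature points (landmarks 36 and 45, the outer eye
        corners), or None if no face is detected."""
        faces = detector(gray_image, 1)
        if not faces:
            return None
        landmarks = predictor(gray_image, faces[0])
        first = np.array([landmarks.part(36).x, landmarks.part(36).y])
        second = np.array([landmarks.part(45).x, landmarks.part(45).y])
        return (first + second) / 2.0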



Abstract

The invention belongs to the technical field of three-dimensional display and provides a tracking and positioning method and device for three-dimensional display. The tracking and positioning method comprises the steps of: S1, arranging at least a first auxiliary positioning point and a second auxiliary positioning point at positions corresponding to the eyes of a viewer; S2, acquiring the position information of the first auxiliary positioning point and the position information of the second auxiliary positioning point in real time; and S3, acquiring the position information of the center point of the two eyes of the viewer according to the position information of the first auxiliary positioning point and the position information of the second auxiliary positioning point, wherein the center point of the two eyes of the viewer is the point of central symmetry between the viewer's left eye and right eye, and taking the position information of this center point as the tracking and positioning information while the viewer is watching the three-dimensional display with both eyes. Through the tracking and positioning method and device, the positioning information can be accurately acquired even when the viewer watches the three-dimensional display image with his or her head tilted. The crosstalk that occurs when a viewer watches a three-dimensional display image with a tilted head is thus effectively reduced, and the three-dimensional display effect is optimized.

Description

Technical field

[0001] The present invention relates to the technical field of stereoscopic display, and in particular to a tracking and positioning method and device for stereoscopic display.

Background technology

[0002] In recent years, stereoscopic display technology has developed rapidly and has become a research hotspot. It is used more and more widely in fields such as medical treatment, advertising, the military, exhibitions, games and vehicle-mounted display. Stereoscopic display technology includes glasses-based stereoscopic display and glasses-free (naked-eye) stereoscopic display. The glasses-based approach was developed very early, is currently relatively mature, and is still used in many fields, while naked-eye stereoscopic display started relatively late and is technically more difficult than the glasses-based approach. There are applications in related field...


Application Information

Patent Type & Authority: Application (China)
IPC (8): H04N13/167, H04N13/302
Inventors: 叶磊, 李统福, 李焘然, 韩周迎, 乔梦阳, 周峰, 赵兴海
Owner: SUPERD CO LTD