Positioning method and handle

A positioning method and handle technology, applied in the field of virtual reality, addresses the problems that brightness changes are not pronounced enough to be extracted and recognized as valid codes, so that the codes are difficult to identify reliably, and achieves the effect of improving positioning accuracy.

Status: Inactive
Publication Date: 2017-05-10
Applicant: LETV HLDG BEIJING CO LTD +1

AI Technical Summary

Problems solved by technology

[0004] However, in the process of realizing the present invention, the inventors found that the prior art has at least the following problems: the coding is usually analyzed frame by frame and compared one by one, but with this approach a brightness change that is not pronounced cannot be extracted and therefore cannot be recognized as a valid code; moreover, in monocular stereo vision based on the Hamming code (or other codes), the brightness of the LEDs changes with the viewing angle or the speed of movement, which makes the codes difficult to identify reliably and thus makes it impossible to determine the position of the handle accurately.




Embodiment Construction

[0019] In order to make the objectives, technical solutions and advantages of the present invention clearer, various embodiments of the present invention are described in detail below with reference to the accompanying drawings. However, those of ordinary skill in the art will understand that in each embodiment of the present invention many technical details are set forth so that readers may better understand the present application; the technical solutions claimed in this application can nevertheless be realized without these technical details and with various changes and modifications based on the following embodiments.

[0020] The first embodiment of the present invention relates to a positioning method; the specific process is shown in figure 1.

[0021] In step 101, the identification code corresponding to each lamp in the lamp group is acquired.

[0022] Specifically, the positioning method in this embodiment is mainly applied to an undetermin...
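
The embodiment does not spell out how the identification code of each lamp is obtained in step 101. The following is a minimal, hypothetical Python sketch assuming that each lamp signals its code as an on/off brightness pattern over consecutive camera frames; the function name, the per-lamp brightness samples, and the simple adaptive threshold are illustrative assumptions, not the patented implementation.

```python
# Hypothetical sketch of step 101: decoding one lamp's identification code
# from its brightness measured over consecutive frames (illustrative only).
import numpy as np

def acquire_identification_code(brightness_samples, threshold=None):
    """brightness_samples: one brightness value per frame for a tracked lamp blob.
    Returns the decoded code as a tuple of bits, e.g. (1, 0, 1, 1, 0, 0, 1, 0)."""
    samples = np.asarray(brightness_samples, dtype=np.float64)
    if threshold is None:
        # Midpoint between the darkest and brightest samples, so that a globally
        # dim or bright appearance does not shift the on/off decision boundary.
        threshold = 0.5 * (samples.min() + samples.max())
    return tuple(int(v > threshold) for v in samples)
```

As the background section notes, such plain frame-by-frame thresholding can fail when the brightness change is not pronounced, which is the motivation for the model-based matching sketched under the abstract below.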



Abstract

An embodiment of the invention relates to the technical field of virtual reality and discloses a positioning method and a handle. In the invention, a method for positioning an object to be positioned that is equipped with a lamp group comprises the following steps: the identification code corresponding to each lamp in the lamp group is obtained; the position information of each lamp in the lamp group is determined according to its identification code, where determining the position information of one lamp comprises testing N prestored models with the identification code, determining the model corresponding to that identification code, and then determining the position information of the lamp from the position information that corresponds to the models in a one-to-one manner, N being a natural number greater than one; and the position of the object to be positioned is determined by three-dimensional attitude estimation based on the position information of all the lamps. With the positioning method and the handle provided by the invention, the problem that codes are difficult to identify because brightness changes are not obvious can be solved, and the effect of monocular stereo vision can be achieved.
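
The abstract describes the pipeline only at a functional level. Below is a minimal Python sketch of that pipeline under stated assumptions: the N prestored models are represented here as reference code templates paired with known lamp positions on the handle, matching is done by nearest Hamming distance, and the final three-dimensional attitude estimation is delegated to OpenCV's solvePnP. PRESTORED_MODELS, match_model, estimate_handle_pose and the numeric values are hypothetical names and data, not the patent's actual implementation.

```python
# Minimal sketch (not the patented implementation): match each lamp's observed
# identification code against N prestored models, look up the lamp's position on
# the handle, then estimate the handle's 3D pose from all lamps via PnP.
import numpy as np
import cv2

# Hypothetical prestored models: reference code template -> lamp position (metres)
PRESTORED_MODELS = {
    (1, 0, 1, 1, 0, 0, 1, 0): np.array([ 0.03,  0.00, 0.010]),
    (0, 1, 1, 0, 1, 0, 0, 1): np.array([-0.03,  0.00, 0.010]),
    (1, 1, 0, 0, 1, 1, 0, 0): np.array([ 0.00,  0.03, 0.015]),
    (0, 0, 1, 1, 0, 1, 1, 0): np.array([ 0.00, -0.03, 0.005]),
}

def match_model(code):
    """Test the observed code against the N prestored models and return the
    position information of the best-matching model (nearest Hamming distance)."""
    code = np.asarray(code)
    _, position = min(PRESTORED_MODELS.items(),
                      key=lambda kv: int(np.sum(code != np.asarray(kv[0]))))
    return position

def estimate_handle_pose(observed, camera_matrix, dist_coeffs=None):
    """observed: list of (code, (u, v)) pairs, one per detected lamp, where (u, v)
    is the lamp's pixel centre; at least four lamps are needed for solvePnP.
    camera_matrix is the camera's 3x3 intrinsic matrix.
    Returns the rotation and translation of the handle in the camera frame."""
    object_pts = np.array([match_model(code) for code, _ in observed], dtype=np.float64)
    image_pts = np.array([uv for _, uv in observed], dtype=np.float64)
    ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("3D attitude estimation failed")
    return rvec, tvec
```

Because each lamp is identified by its own code, the 2D-3D correspondences needed for pose estimation come directly from the model matching step, which is what allows a single (monocular) camera to recover the handle's pose.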

Description

Technical field [0001] The embodiments of the present invention relate to the field of virtual reality technology, and in particular to a positioning method and a handle. Background technique [0002] Virtual reality (Virtual Reality, VR for short) technology is a comprehensive information technology that emerged at the end of the 20th century; it is an advanced, digital human-machine interface technology. The operating environment it provides and the immersive experience it brings will fundamentally change the dull and passive way in which people currently interact with computers, and open a new research field for human-computer interaction technology. With the development of VR technology, the handle used to interact with virtual objects has also received extensive attention. [0003] In the prior art, the interaction between the handle and virtual objects is mainly realized by multiplexing the ordinary color camera on the VR glasses with the spe...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/70, G06K9/62, G06F3/01
CPC: G06F3/011, G06T2207/20081, G06T2207/10004, G06F18/214, G06F18/2431, G06F18/2411
Inventor: 张超
Owner: LETV HLDG BEIJING CO LTD