Method and system for vision-based interaction in a virtual environment

A virtual environment and interaction technology, applied in the field of human-machine interfaces, addresses the problems of limited user experience in a virtual environment and a lack of realism.

Active Publication Date: 2012-06-28
MICROSOFT TECH LICENSING LLC

AI Technical Summary

Problems solved by technology

The user's experience in a virtual environment is therefore limited by a lack of realism, a lack of physical feedback from the virtual environment, and a lack of natural means for interaction.
However, the user's experience is limited by the requirement of wearing feedback devices.
Virtual reality using only visual feedback has limitations of its own.
However, peripheral devices are still required to interact with the virtual world.

Method used


Examples


Embodiment Construction

[0024]A method and system for vision-based interaction in a virtual environment is disclosed. According to one embodiment, a computer-implemented method comprises receiving data from a plurality of sensors to generate a meshed volumetric three-dimensional representation of a subject. A plurality of clusters is identified within the meshed volumetric three-dimensional representation that corresponds to motion features. The motion features include hands, feet, knees, elbows, head, and shoulders. The plurality of sensors is used to track motion of the subject and manipulate the motion features of the meshed volumetric three-dimensional representation.
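The cluster-identification step described above can be illustrated with a rough sketch. Note this is a hypothetical simplification, not the patent's implementation: the point data, the `eps` distance threshold, and the `cluster_points` helper are all invented for illustration. It groups 3-D points into clusters by proximity and labels the highest cluster as a candidate head feature:

```python
from collections import deque

def euclid(a, b):
    """Euclidean distance between two 3-D points."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def cluster_points(points, eps=0.15):
    """Group points into clusters: any point within `eps` of a cluster
    member joins that cluster (simple breadth-first grouping)."""
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        queue, cluster = deque([seed]), [seed]
        while queue:
            i = queue.popleft()
            near = [j for j in unvisited if euclid(points[i], points[j]) <= eps]
            for j in near:
                unvisited.remove(j)
                cluster.append(j)
                queue.append(j)
        clusters.append([points[i] for i in cluster])
    return clusters

# Toy stand-in for a volumetric representation: two blobs of points,
# one near head height and one near hand height (units in meters).
head_pts = [(0.0, 1.70, 0.0), (0.05, 1.72, 0.01), (0.02, 1.68, 0.03)]
hand_pts = [(0.60, 1.10, 0.20), (0.62, 1.12, 0.21)]
clusters = cluster_points(head_pts + hand_pts, eps=0.2)

# A crude feature-labelling heuristic: the highest-centroid cluster
# is taken as the head feature.
def centroid_y(cluster):
    return sum(p[1] for p in cluster) / len(cluster)

head_cluster = max(clusters, key=centroid_y)
```

A real system would operate on a dense meshed volume from multiple sensors and use far more robust feature labelling, but the grouping idea is the same.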

[0025]Each of the features and teachings disclosed herein can be utilized separately or in conjunction with other features and teachings to provide a method and system for vision-based interaction in a virtual environment. Representative examples utilizing many of these additional features and teachings, both separately and in combination, are...



Abstract

Method, computer program and system for tracking movement of a subject. The method includes receiving data from a plurality of fixed position sensors comprising a distributed network of time of flight camera sensors to generate a volumetric three-dimensional representation of the subject, identifying a plurality of clusters within the volumetric three-dimensional representation that correspond to features indicative of motion of the subject relative to the fixed position sensors and one or more other portions of the subject, and presenting one or more objects on one or more three dimensional display screens. The plurality of fixed position sensors are used to track motion of the features of the subject to manipulate the volumetric three-dimensional representation to determine interaction of one or more of the features of the subject and one or more of the one or more objects on one or more of the one or more three dimensional display screens.
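The interaction determination in the abstract, deciding whether a tracked feature of the subject touches an object presented on a display, can be sketched as a simple containment test. This is a hypothetical illustration only: the `Box` class, the coordinates, and the `margin` parameter are invented assumptions, not the patent's method.

```python
from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned bounding box for an object shown on a 3-D display."""
    lo: tuple  # (x, y, z) minimum corner
    hi: tuple  # (x, y, z) maximum corner

    def contains(self, p, margin=0.0):
        """True if point p lies inside the box, padded by `margin`."""
        return all(l - margin <= c <= h + margin
                   for c, l, h in zip(p, self.lo, self.hi))

# A virtual button rendered on the display (illustrative coordinates).
button = Box(lo=(0.4, 1.0, 0.0), hi=(0.6, 1.2, 0.1))

# Successive tracked positions of a hand feature over several frames.
hand_track = [(0.1, 0.9, 0.5), (0.3, 1.0, 0.3), (0.5, 1.1, 0.05)]

# Interaction occurs on any frame where the hand enters the button volume.
touched = [p for p in hand_track if button.contains(p, margin=0.02)]
```

In practice the object geometry would come from the rendering pipeline and the feature positions from the distributed time-of-flight sensor network, but the per-frame intersection test follows this pattern.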

Description

CROSS REFERENCE TO RELATED APPLICATIONS[0001]The present application is a continuation of U.S. patent application Ser. No. 12/028,704, filed Feb. 8, 2008 to El Dokor et al., entitled “Method and System for Vision-Based Interaction in a Virtual Environment.” The '704 application in turn claims the benefit of and priority to U.S. Provisional Patent Application No. 60/899,971, filed on Feb. 8, 2007, entitled “Natural Interaction in Cyberspace”; Application No. 60/901,548, filed on Feb. 16, 2007, entitled “Naturally Interactive Environments”; and Application No. 60/966,056, filed on Aug. 27, 2007, entitled “Multi-Player Vision-Based Aerobic Gaming Controls.” The entire contents of each of these applications are hereby incorporated by reference.FIELD OF THE INVENTION[0002]The field of the invention relates generally to the field of human-machine interfaces and more particularly relates to a method and sy...

Claims


Application Information

Patent Type & Authority: Application (United States)
IPC(8): H04N13/02
CPC: A63F2300/1093; G06F3/011; A63F13/00; A63F2300/6607; A63F13/06; A63F2300/1018; G06T15/00; E05Y2900/132; A63F13/22; A63F13/213; A63F2300/1087; A63F13/79; A63F13/428; A63F13/56; A63F2300/208
Inventors: EL DOKOR, TAREK; KING, JOSHUA E.; HOLMES, JAMES E.; GIGLIOTTI, JUSTIN R.; GLOMSKI, WILLIAM E.
Owner MICROSOFT TECH LICENSING LLC