
Remote touch detection enabled by peripheral

A technology relating to electronic devices and programs, applied in the field of computer-generated reality (CGR) environments, addressing problems such as cumbersome and unintuitive interaction with virtual objects

Active Publication Date: 2022-04-01
APPLE INC
Cites: 6 | Cited by: 0

AI Technical Summary

Problems solved by technology

However, such techniques for interacting with virtual objects can be cumbersome and unintuitive for users.


Examples


Embodiment 1

[0091] Embodiment 1. A method comprising:

[0092] acquiring first image data about an input, wherein the first image data is acquired using one or more camera sensors of a first electronic device that is external to a second electronic device;

[0093] acquiring second image data about the input, wherein the second image data is acquired using one or more camera sensors of the second electronic device, the first electronic device being different from the second electronic device; and

[0094] in accordance with a determination, based on the first image data and the second image data, that a set of one or more criteria is satisfied, performing an operation based on the input, wherein the set of one or more criteria includes a criterion that is met when the input is a touch input.
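
The following is a minimal Swift sketch of the flow described in Embodiment 1, with the else branch mirroring the forgoing behavior of Embodiment 2 below. The types (ImageData, DetectedInput), the fingertip-height heuristic, and the threshold value are illustrative assumptions, not interfaces or criteria taken from the disclosure.

    // Sketch of Embodiment 1: evaluate a touch criterion against image data
    // from two devices' camera sensors before acting on the input.

    /// Image data about a potential input, captured by one device's camera sensors.
    struct ImageData {
        /// Estimated fingertip height above the physical surface, in millimeters.
        var fingertipHeight: Double
    }

    /// A detected input with a position on the physical surface.
    struct DetectedInput {
        var x: Double
        var y: Double
    }

    /// Returns true when both views agree the fingertip is close enough to the
    /// surface to count as a touch (the criterion met when the input is a touch input).
    func touchCriteriaSatisfied(first: ImageData, second: ImageData,
                                thresholdMillimeters: Double = 5.0) -> Bool {
        return first.fingertipHeight <= thresholdMillimeters
            && second.fingertipHeight <= thresholdMillimeters
    }

    /// Performs an operation based on the input only if the criteria are satisfied.
    func handle(input: DetectedInput, first: ImageData, second: ImageData) {
        if touchCriteriaSatisfied(first: first, second: second) {
            // Perform an operation based on the input, e.g. select the virtual
            // object rendered at the touched location.
            print("Touch at (\(input.x), \(input.y)) - performing operation")
        } else {
            // Criteria not met: forgo performing the operation (see Embodiment 2).
            print("Input was not a touch - forgoing operation")
        }
    }

    // Example: both views see the fingertip within 5 mm of the surface.
    handle(input: DetectedInput(x: 0.12, y: 0.34),
           first: ImageData(fingertipHeight: 2.0),
           second: ImageData(fingertipHeight: 3.0))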

Embodiment 2

[0095] Embodiment 2. The method according to embodiment 1, further comprising:

[0096] in accordance with a determination, based on the first image data and the second image data, that the set of one or more criteria is not satisfied, forgoing performing the operation based on the input.

Embodiment 3

[0097] Embodiment 3. The method according to any one of embodiments 1 to 2, further comprising:

[0098] prior to acquiring the first image data and the second image data about the input, determining, based on orientation data from the second electronic device, whether a suitable environment for receiving touch input exists.
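
Below is a minimal Swift sketch of the pre-check described in Embodiment 3. It assumes the orientation data can be reduced to a tilt angle and that a "suitable environment" means the peripheral rests roughly flat on a surface; both assumptions are illustrative and not the disclosure's actual criteria.

    // Sketch of Embodiment 3: gate image acquisition on an orientation check.

    /// Orientation data reported by the second electronic device.
    struct OrientationData {
        /// Tilt of the device relative to horizontal, in degrees.
        var tiltFromHorizontalDegrees: Double
    }

    /// A suitable environment for receiving touch input is assumed here to mean
    /// the peripheral is lying roughly flat on a physical surface such as a tabletop.
    func suitableTouchEnvironmentExists(orientation: OrientationData,
                                        maxTiltDegrees: Double = 10.0) -> Bool {
        return abs(orientation.tiltFromHorizontalDegrees) <= maxTiltDegrees
    }

    // Usage: only acquire the first and second image data if the check passes.
    let orientation = OrientationData(tiltFromHorizontalDegrees: 3.5)
    if suitableTouchEnvironmentExists(orientation: orientation) {
        print("Peripheral appears to rest on a surface - start remote touch detection")
    } else {
        print("No suitable touch environment - skip image acquisition")
    }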



Abstract

This application relates to remote touch detection enabled by peripheral devices. The present disclosure generally relates to remote touch detection. In some examples, a first electronic device acquires first image data and second image data about an input and, in accordance with a determination, based on the first image data and the second image data, that a set of one or more criteria is satisfied, performs an operation based on the input. In some examples, the first electronic device causes an infrared light source of a second electronic device to emit infrared light, acquires image data about the input, and, in accordance with a determination, based on the image data, that a set of one or more criteria is satisfied, performs an operation based on the input.
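
As a rough illustration of the second example in the abstract, the Swift sketch below has the first device switch on the peripheral's infrared emitter, acquire image data, and perform an operation only when a touch criterion is met. The InfraredPeripheral protocol, the IRImageData type, and the IR-plane criterion are hypothetical stand-ins, not the disclosure's actual interfaces.

    // Sketch of the infrared-assisted variant: emit IR from the peripheral,
    // capture image data, and act only when the touch criterion is satisfied.

    /// Abstraction over the peripheral (second electronic device).
    protocol InfraredPeripheral {
        func setInfraredEmitter(enabled: Bool)
    }

    /// Image data acquired while the scene is illuminated with infrared light.
    struct IRImageData {
        /// True when the fingertip intersects the IR light at the surface.
        var fingertipIntersectsIRPlane: Bool
    }

    func detectRemoteTouch(peripheral: InfraredPeripheral,
                           acquireImageData: () -> IRImageData,
                           performOperation: () -> Void) {
        // Cause the peripheral's infrared light source to emit infrared light.
        peripheral.setInfraredEmitter(enabled: true)
        defer { peripheral.setInfraredEmitter(enabled: false) }

        // Acquire image data about the input while the IR source is active.
        let imageData = acquireImageData()

        // Criterion met when the input is a touch: the fingertip breaks the IR plane.
        if imageData.fingertipIntersectsIRPlane {
            performOperation()
        }
    }

    // Example with a stand-in peripheral that just logs its emitter state.
    struct LoggingPeripheral: InfraredPeripheral {
        func setInfraredEmitter(enabled: Bool) {
            print("IR emitter \(enabled ? "on" : "off")")
        }
    }

    detectRemoteTouch(peripheral: LoggingPeripheral(),
                      acquireImageData: { IRImageData(fingertipIntersectsIRPlane: true) },
                      performOperation: { print("Performing operation for touch input") })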

Description

Technical Field

[0001] The present disclosure relates generally to computer-generated reality (CGR) environments, and more particularly to techniques for remote touch detection.

Background

[0002] A CGR environment is one in which some of the objects displayed for viewing by a user are computer-generated. Users can interact with these virtual objects by activating hardware buttons or touching touch-enabled hardware. However, such techniques for interacting with virtual objects can be cumbersome and unintuitive for users.

Summary of the Invention

[0003] Described herein are techniques for implementing remote touch detection using a system of multiple devices, including a peripheral device placed on a physical surface, such as a tabletop. With these techniques, users can interact with virtual objects by touching them on physical surfaces.

[0004] In some embodiments, a method is described. The method comprises: at the first electronic device: obtaining first ...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06F3/01, G06F3/042, G06V40/10
CPC: G06F3/017, G06F3/0421, G06F3/04845, G06F2203/04808, G06F2203/04806, G06F3/04883, G06F3/04886, G06F3/0425, G06F3/042, G06F3/0304, G06F3/0488, G06F3/044, G06F3/0426
Inventor: S·L·埃格勒斯亚斯, D·W·查尔默斯, R·塞西, 王乐晶
Owner: APPLE INC