
User Interface Gestures For Moving a Virtual Camera On A Mobile Device

A virtual camera and user interface technology, applied in the field of navigation in a three-dimensional environment, which can solve problems such as unintended orientation changes, the virtual camera moving backwards, and the imprecision of double-tapping a view with a finger.

Inactive Publication Date: 2010-02-25
GOOGLE LLC
Cited by: 88

AI Technical Summary

Benefits of technology

This patent describes a method and system for navigating a virtual camera in a three-dimensional environment on a mobile device using user interface gestures. The system can detect the movement of two objects on the touch screen of the device, and use this information to determine the speed and direction of the virtual camera in the environment. The virtual camera can then be moved and rotated based on the user's touch input. The technical effects of this invention include improved user experience and more intuitive interaction with virtual cameras on mobile devices.
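As an illustration of the mapping the summary describes, the sketch below derives a camera speed from the rate at which two touch points move relative to each other. This is a minimal sketch under assumptions of my own: the function names, the linear gain, and the exact formula are hypothetical and are not specified in the patent.

```python
import math

def distance(p, q):
    """Euclidean distance between two 2-D touch points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def pinch_camera_speed(p1_prev, p2_prev, p1_curr, p2_curr, dt, gain=1.0):
    """Map the rate of change of finger separation to a camera speed.

    Hypothetical mapping: fingers moving toward each other (separation
    shrinking) yield a positive speed (camera moves forward); fingers
    moving apart yield a negative speed (camera moves backwards),
    matching the gesture behavior described in the patent text.
    """
    d_prev = distance(p1_prev, p2_prev)
    d_curr = distance(p1_curr, p2_curr)
    pinch_rate = (d_curr - d_prev) / dt  # pixels per second
    # Sign flip: shrinking separation (negative rate) -> forward motion.
    return -gain * pinch_rate
```

A real implementation would clamp the speed and possibly apply a nonlinear gain so that small pinches produce fine movement and fast pinches produce coarse movement.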

Problems solved by technology

Further, user input may cause virtual camera 202 to change orientation, such as pitch, yaw, or roll.
Moving the fingers towards each other may cause the virtual camera to move forward, whereas moving the fingers away from each other may cause the virtual camera to move backwards.
While easy for the user, double-tapping a view with a finger can be imprecise.
A finger touch may occupy a substantial portion of the view, so small changes in the position of the wide finger may result in large changes in the target location.
As a result, angular jump navigation may be unstable.
Further, the accelerometer readings may be damped.
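Damping of accelerometer readings, as mentioned above, is commonly implemented as an exponential low-pass filter over the raw samples. The sketch below shows one plausible form; the class name and the `alpha` parameter are assumptions for illustration, not details taken from the patent.

```python
class DampedAccelerometer:
    """Exponential low-pass filter over raw 3-axis accelerometer samples.

    alpha in (0, 1]: smaller values damp jitter more aggressively at the
    cost of responsiveness (assumed parameterization).
    """

    def __init__(self, alpha=0.2):
        self.alpha = alpha
        self.value = None  # last filtered sample, as an (x, y, z) tuple

    def update(self, sample):
        if self.value is None:
            # First sample initializes the filter state directly.
            self.value = tuple(sample)
        else:
            # Blend the new sample with the previous filtered value.
            self.value = tuple(
                self.alpha * s + (1 - self.alpha) * v
                for s, v in zip(sample, self.value)
            )
        return self.value
```

Feeding the filtered value, rather than the raw sensor reading, into the camera-tilt logic suppresses hand tremor and sensor noise.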




Embodiment Construction

[0034]Embodiments of the present invention provide for navigation in a three dimensional environment on a mobile device. In the detailed description of embodiments that follows, references to “one embodiment”, “an embodiment”, “an example embodiment”, etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.

[0035]This detailed description is divided into sections. The first section provides an introduction to navigation through three dimensional env...



Abstract

This invention relates to user interface gestures for moving a virtual camera on a mobile device. In an embodiment, a computer-implemented method navigates a virtual camera in a three dimensional environment on a mobile device having a touch screen. A user input is received indicating that two objects have touched a view of the mobile device and the two objects have moved relative to each other. A speed of the objects is determined based on the user input. A speed of the virtual camera is determined based on the speed of the objects. The virtual camera is moved relative to the three dimensional environment according to the speed of the virtual camera.
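The final step of the abstract, moving the virtual camera relative to the environment according to the determined speed, amounts to integrating that speed along the camera's forward axis for one frame. A hedged sketch, with all names assumed and a unit-length forward vector presumed:

```python
def move_camera(position, forward, camera_speed, dt):
    """Advance the camera position along its (unit) forward vector.

    position: (x, y, z) camera position in the 3-D environment.
    forward:  (x, y, z) unit vector the camera is facing.
    camera_speed: signed speed; negative moves the camera backwards.
    dt: frame time in seconds.
    """
    return tuple(p + f * camera_speed * dt
                 for p, f in zip(position, forward))
```

Called once per frame with the speed derived from the pinch gesture, this produces continuous forward or backward travel through the scene.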

Description

[0001]This application claims the benefit of U.S. Provisional Pat. Appl. No. 61/091,234, filed Aug. 22, 2008, which is incorporated by reference herein in its entirety.BACKGROUND[0002]1. Field of the Invention[0003]This invention generally relates to navigation in a three dimensional environment.[0004]2. Background Art[0005]Systems exist for navigating through a three dimensional environment to display three dimensional data. The three dimensional environment includes a virtual camera that defines what three dimensional data to display. The virtual camera has a perspective according to its position and orientation. By changing the perspective of the virtual camera, a user can navigate through the three dimensional environment.[0006]Mobile devices, such as cell phones, personal digital assistants (PDAs), portable navigation devices (PNDs) and handheld game consoles, are being made with improved computing capabilities. Many mobile devices can access one or more networks, such as the I...


Application Information

Patent Type & Authority: Application (United States)
IPC(8): G09G5/00; G06F3/041
CPC: G06F1/1626; G06F3/04815; G06T19/003; G06F3/04883; G06F2200/1637; G06F3/0488; G06F3/017; G06F3/0485; G06T2200/24
Inventors: Kornmann, David; Birch, Peter; Morton, Michael
Owner GOOGLE LLC