Virtual Fixtures for Improved Performance in Human/Autonomous Manipulation Tasks

A technology of virtual fixtures for human/autonomous manipulation tasks, applied in the field of haptic rendering. It addresses problems such as the haptic interface point (HIP) penetrating virtual environment objects and the "pop through" artifact from which direct rendering methods suffer, so as to save time, equipment, and expense.

Inactive Publication Date: 2014-10-30
UNIV OF WASHINGTON CENT FOR COMMERCIALIZATION

AI Technical Summary

Benefits of technology

[0023]Another advantage of this application is that haptic feedback is generated at rapid rates based on depth data, such as a stream of frames with depth information. As disclosed herein, the use of a stream of frames with depth information permits haptic rendering, or providing haptic feedback, at a haptic rendering rate faster than a frame-reception rate. In some embodiments, a herein-described haptic rendering system can have a haptic rendering rate of at least 1000 Hz.
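The decoupling of the haptic rendering rate from the frame-reception rate can be sketched as two loops sharing the latest point cloud. This is a minimal illustrative sketch, not the patent's implementation: the `HapticRenderer` class, the toy nearest-point penalty force, and the sleep-based 1000 Hz loop are all assumptions.

```python
import threading
import time

class HapticRenderer:
    """Sketch: decouple haptic force updates from depth-frame arrival.

    Frame reception (e.g. ~30 Hz, an assumed rate) updates a shared
    point cloud; a separate thread recomputes the feedback force at
    ~1000 Hz against whatever cloud is latest.
    """

    def __init__(self):
        self.lock = threading.Lock()
        self.points = []                # latest point cloud from depth data
        self.force = (0.0, 0.0, 0.0)
        self.running = True

    def on_frame(self, points):
        """Called at the (slower) frame-reception rate."""
        with self.lock:
            self.points = points

    def haptic_loop(self, hip_position, stiffness=200.0):
        """Runs at the haptic rendering rate (target 1000 Hz)."""
        period = 1.0 / 1000.0
        while self.running:
            with self.lock:
                cloud = self.points
            self.force = self._compute_force(hip_position, cloud, stiffness)
            time.sleep(period)          # a real system uses a hard real-time timer

    def _compute_force(self, hip, cloud, k):
        # Toy penalty force: push the HIP away from its nearest point.
        if not cloud:
            return (0.0, 0.0, 0.0)
        nearest = min(cloud,
                      key=lambda p: sum((a - b) ** 2 for a, b in zip(p, hip)))
        return tuple(k * (h - p) for h, p in zip(hip, nearest))
```

Because the haptic thread only reads the most recent cloud under a lock, force updates continue at the full rendering rate even when frames arrive slowly.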
[0024]Another advantage of this application is that virtual fixtures can be defined based on objects recognized in the environment. That is, as an object changes within the environment, a virtual fixture associated with the object can dynamically adjust to the changes of the object. Another advantage is that virtual fixtures can be dynamically changed based on the status of operations, such as tasks listed and organized on a task list. One or more virtual fixtures can be associated with task(s) on the task list. Virtual fixtures associated with a task can change based on the status of the task. Thus, the virtual fixtures can change throughout execution of a task, and so guide completion of the task. Further, a level of automation can be specified that controls the virtual fixtures, and so allows for full automation, partial automation, or no automation (manual control) for completing the task.
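Task-list-driven fixtures might be organized as below. This is a hedged sketch: the `VirtualFixture`, `Task`, `Status`, and `Automation` types and the per-status geometry lookup are illustrative assumptions, not the patent's data structures.

```python
from dataclasses import dataclass, field
from enum import Enum

class Status(Enum):
    PENDING = "pending"
    ACTIVE = "active"
    DONE = "done"

class Automation(Enum):
    MANUAL = 0    # operator supplies all motion; no fixture constraints
    PARTIAL = 1   # fixtures constrain/guide operator motion
    FULL = 2      # fixtures drive the motion autonomously

@dataclass
class VirtualFixture:
    name: str
    # Hypothetical per-status geometry, e.g. a wide approach cone while
    # the task is active and nothing once it is done.
    geometry_by_status: dict

@dataclass
class Task:
    name: str
    status: Status = Status.PENDING
    fixtures: list = field(default_factory=list)

def active_fixture_geometries(tasks, automation):
    """Return the fixture geometries to render for the current task states."""
    if automation is Automation.MANUAL:
        return []  # manual control: no fixture constraints in this sketch
    geoms = []
    for task in tasks:
        for fx in task.fixtures:
            geom = fx.geometry_by_status.get(task.status)
            if geom is not None:
                geoms.append(geom)
    return geoms
```

Marking a task done simply changes which geometries the lookup returns, so the rendered fixtures track task status without any fixture being rebuilt by hand.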
[0025]Another advantage of this application is providing a camera that captures images and depth information underwater. The images can be captured in one range of frequencies, such as within a blue-green range of visible light frequencies. The depth information can be captured in a different range of frequencies, such as in a near-infrared (NIR) range of frequencies. The visible light can be captured and a visible light image generated. At the same time, NIR radiation can be captured and an image with depth information generated. The visible light image and the image with depth information can be sent from the camera in succession, so that both visible-light information and depth information for a same time can be provided. The visible-light information and depth information can be used to generate haptic feedback for operators controlling devices to perform underwater tasks.
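Sending the visible-light image and the depth image in succession implies the receiver must pair the two frames that share a capture time. A minimal sketch, assuming each frame arrives tagged with its kind and a shared timestamp (`pair_frames` is a hypothetical helper, not an API from the patent):

```python
def pair_frames(stream):
    """Pair an interleaved stream of ('visible', t, img) and
    ('depth', t, img) frames into (t, visible, depth) tuples for the
    same capture time t."""
    pending = {}   # timestamp -> {kind: image}
    pairs = []
    for kind, t, img in stream:
        entry = pending.setdefault(t, {})
        entry[kind] = img
        if 'visible' in entry and 'depth' in entry:
            pairs.append((t, entry['visible'], entry['depth']))
            del pending[t]
    return pairs
```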
[0026]Providing haptic feedback from underwater sensors to an operator at the surface has the potential to transform subsea manipulation capability. Virtual fixtures will allow manipulators to precisely maneuver in sensitive environments where contact should not be made with surrounding objects or structures. Biological studies of animal colonies around hydrothermal vents could benefit greatly from such a capability, allowing scientists to carefully gather data with unprecedented resolution and proximity to sensitive organisms. Common tasks such as connector mating between instruments can be carried out very efficiently by creating guidance fixtures near male and female connectors. Thus, time, expense, and equipment can be saved, and the environment preserved, using the herein-described haptic feedback techniques to perform underwater tasks.

Problems solved by technology

Ideally, the haptic interface point (HIP) should not be able to penetrate virtual environment objects. Unfortunately, the direct method suffers from "pop through", an artifact that arises when the rendering algorithm erroneously penetrates a thin surface. This method results in a virtual tool that actually penetrates objects in the environment; fortunately, this penetration is typically very small.
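The pop-through artifact can be illustrated with a toy one-dimensional penalty model. The functions below are illustrative assumptions, not the patent's rendering algorithm: the direct method pushes toward the *nearest* face of a slab, so past the midpoint of a thin slab the force flips sign and ejects the tool out the far side, while a proxy (god-object) held at the entry face always pushes back the way the tool came.

```python
def direct_force(hip, a, b, k=1.0):
    """Direct penalty force on a 1-D slab occupying [a, b].
    Pushes toward the nearer face, so once the HIP crosses the
    midpoint of a thin slab the force ejects it out the far side
    ("pop through")."""
    if not (a < hip < b):
        return 0.0
    return k * (a - hip) if (hip - a) < (b - hip) else k * (b - hip)

def proxy_force(hip, proxy, k=1.0):
    """Proxy method: the proxy is held on the face where the HIP
    entered, so the restoring force always points back out that face,
    however deep the HIP goes."""
    return k * (proxy - hip)
```

For a slab on [0, 1] entered from below, `direct_force(0.9, 0, 1)` is positive (pushing the tool out the wrong side), while `proxy_force(0.9, 0.0)` remains negative, back toward the entry face.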

Method used



Examples


Example implementation


[0140]As an example, the herein-described 6-DOF haptic rendering algorithm was implemented on a desktop computer (AMD Phenom II X6 with a Radeon HD 6990 GPU) running Ubuntu 11.10. The force was calculated asynchronously at 1000 Hz in a separate thread. During typical interaction, the collision detection algorithm ran at 15 kHz. Point images were filtered and normal vectors were calculated for every point using the GPU.

[0141]Realtime processing was achieved using a neighborhood of 9×9 points for filtering as well as normal vector calculation. The position of the haptic rendering device was both controlled automatically (for purposes of producing accurate results) as well as with a Phantom Omni haptic rendering device. Using the latter, only translational forces could be perceived by the user since the Phantom Omni only provides 3 DOFs of sensation.
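Normal estimation from a point neighborhood is commonly done by fitting a plane via the eigendecomposition of the neighborhood covariance; the sketch below shows that standard approach on the CPU with NumPy for a 9×9 neighborhood, whereas the patent's implementation runs on the GPU and may differ in detail.

```python
import numpy as np

def estimate_normal(point_image, r, c, half=4):
    """Estimate the surface normal at pixel (r, c) of an HxWx3 point
    image by fitting a plane to its (2*half+1)^2 neighborhood (9x9 for
    half=4): the eigenvector of the neighborhood covariance with the
    smallest eigenvalue is the direction of least variance, i.e. the
    plane normal."""
    h, w, _ = point_image.shape
    r0, r1 = max(0, r - half), min(h, r + half + 1)
    c0, c1 = max(0, c - half), min(w, c + half + 1)
    nbhd = point_image[r0:r1, c0:c1].reshape(-1, 3)
    centered = nbhd - nbhd.mean(axis=0)
    cov = centered.T @ centered
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    normal = eigvecs[:, 0]                  # smallest-variance direction
    # Orient toward the sensor (assumed at the origin, looking along +z).
    if normal[2] > 0:
        normal = -normal
    return normal
```

On a GPU, the same covariance accumulation can be evaluated independently per pixel, which is why per-point filtering and normal computation parallelize well.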

[0142]To evaluate the presented haptic rendering method in a noise free environment, a virtual box (with 5 sides but no top) was con...



Abstract

Apparatus and method for defining and utilizing virtual fixtures in haptic rendering sessions interacting with various environments, including underwater environments, are provided. A computing device can receive first depth data about an environment. The computing device can generate a first plurality of points from the first depth data. The computing device can determine a haptic interface point (HIP) and can define a virtual fixture for the environment. The computing device can determine a first force vector between the HIP and the first plurality of points using the computing device, where the first force vector is based on the virtual fixture. The computing device can send a first indication of haptic feedback based on the first force vector.

Description

RELATED APPLICATIONS[0001]The present application claims priority to U.S. Provisional Patent Application No. 61/756,132, entitled "Methods and Systems for Six Degree-of-Freedom Haptic Interaction with Streaming Point Clouds", filed Jan. 24, 2013, U.S. Provisional Patent Application No. 61/764,908, entitled "Methods for Underwater Haptic Rendering Using Noncontact Sensors", filed Feb. 14, 2013, and U.S. Provisional Patent Application No. 61/764,921, entitled "Virtual Fixtures for Subsea Technology", filed Feb. 14, 2013, all of which are entirely incorporated by reference herein for all purposes.STATEMENT OF GOVERNMENT RIGHTS[0002]This invention was made with government support under grant no. 0930930, awarded by the National Science Foundation, and with support under grant MRSEED01-006, "Haptically-Enabled Co-Robotics for Remediation of Military Munitions Underwater", for the Strategic Environmental Research and Development Program (SERDP). The United States Government has certain rights in t...

Claims


Application Information

Patent Timeline
Patent Type & Authority: Applications (United States)
IPC(8): G06F3/01
CPC: G06F3/016; A61B34/25; H04N23/56; G06T15/04
Inventors: CHIZECK, HOWARD JAY; RYDEN, FREDRIK; STEWART, ANDREW
Owner: UNIV OF WASHINGTON CENT FOR COMMERCIALIZATION