
Haptic rendering method based on texture image

A haptic rendering technology based on texture images, applied in the field of image processing. It addresses problems such as the inability to simulate real texture contact, the limited range of texture types that can be expressed, and the lack of realism.

Active Publication Date: 2014-06-18
SOUTHEAST UNIV
Cites: 3 · Cited by: 17

AI Technical Summary

Problems solved by technology

Deterministic models can reflect texture characteristics to a certain extent, but the range of textures they can express is limited; the rendered forces are clearly distinguishable yet lack realism and cannot simulate contact with real textures.



Examples


Embodiment Construction

[0047] The technical solution of the present invention is described in detail below with reference to the accompanying drawings.

[0048] As shown in Figure 1 and Figure 2, the haptic rendering method based on a texture image according to the present invention includes the following steps:

[0049] Step 10) Extract texture features; the texture features include the microscopic height of the texture surface and a dynamic friction coefficient representing the texture roughness.

[0050] The microscopic height of the texture surface is extracted with a shape-from-shading method. The specific process is as follows: first, following the shape-from-shading approach, assuming that the light source is a point source at infinity, the reflectance model is Lambertian, and the imaging geometry is an orthographic projection, the texture shown in formula (1) is established...
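Under these stated assumptions (distant point light, Lambertian reflectance, orthographic projection), a minimal shape-from-shading sketch can recover a relative micro-height map from a grayscale texture image. The iteration below follows a simplified Tsai-Shah style update; the function name, default light direction, and iteration count are illustrative assumptions, not values from the patent, and formula (1) itself is not reproduced here.

```python
import numpy as np

def estimate_micro_height(image, light_dir=(0.2, 0.2, 0.96), n_iter=50):
    """Sketch: relative micro-height from a grayscale texture image via a
    simplified Tsai-Shah shape-from-shading iteration (Lambertian model,
    distant point light, orthographic projection)."""
    E = image.astype(np.float64)
    E = E / (E.max() + 1e-12)                 # normalized image irradiance

    s = np.asarray(light_dir, dtype=np.float64)
    sx, sy, sz = s / np.linalg.norm(s)        # unit light direction
    Z = np.zeros_like(E)                      # height estimate (relative)

    for _ in range(n_iter):
        # surface gradients by backward differences
        p = Z - np.roll(Z, 1, axis=1)
        q = Z - np.roll(Z, 1, axis=0)

        denom = np.sqrt(1.0 + p**2 + q**2)
        N = sz - p * sx - q * sy              # numerator of n . s
        R = np.clip(N / denom, 0.0, 1.0)      # Lambertian reflectance map

        f = E - R                             # brightness error
        # derivative of f with respect to Z (Tsai-Shah linearization)
        df = (sx + sy) / denom + N * (p + q) / denom**3
        df = np.where(np.abs(df) < 1e-4, 1e-4, df)
        Z = Z - f / df                        # Newton-style update

    return Z
```

The recovered height map is only defined up to scale; in practice it would be normalized before being used in the subsequent haptic modeling step.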



Abstract

The invention discloses a haptic rendering method based on a texture image. The haptic rendering method comprises the following steps: (10) texture features are extracted, where the texture features comprise the microscopic height of the texture surface and the dynamic friction coefficient representing texture roughness; (20) on the basis of the texture features extracted in step (10), haptic modeling is performed and the resultant texture force is calculated; (30) the resultant texture force calculated in step (20) is output and fed back to the operator through a hand controller. With this method, the dynamic friction coefficient of a virtual texture can be obtained without dedicated measurement equipment, and the roughness of the virtual texture can be conveyed to the user without a special texture-display device.
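The abstract only names steps (20) and (30); the sketch below shows one plausible way to combine the extracted micro-height map and dynamic friction coefficient into a resultant force vector for a hand controller. The split into a slope-modulated normal component and a Coulomb-style tangential friction component is an assumption for illustration, not the patent's stated force model, and all names here are hypothetical.

```python
import numpy as np

def resultant_texture_force(height_map, mu, x, y, normal_force, velocity):
    """Illustrative combination of texture features into a feedback force:
    height_map  - micro-height map from step (10)
    mu          - dynamic friction coefficient from step (10)
    (x, y)      - probe position on the texture (pixel indices)
    normal_force- nominal pressing force of the operator
    velocity    - 2D sliding velocity of the probe."""
    # local gradient of the micro-height map (finite differences)
    gy, gx = np.gradient(height_map)
    slope = np.hypot(gx[y, x], gy[y, x])

    # normal component perturbed by the local texture slope (assumed model)
    fn = normal_force * (1.0 + slope)

    # tangential component: dynamic friction opposing the sliding direction
    v = np.asarray(velocity, dtype=np.float64)
    speed = np.linalg.norm(v)
    ft = -mu * fn * v / speed if speed > 1e-9 else np.zeros(2)

    # resultant texture force sent to the haptic device: (Fx, Fy, Fz)
    return np.array([ft[0], ft[1], fn])
```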

Description

Technical Field

[0001] The invention belongs to the technical field of image processing, and in particular relates to a force-tactile (haptic) rendering method based on texture images.

Background

[0002] Force-tactile rendering is an important part of human-computer interaction. The information it can convey includes surface shape, texture, roughness, temperature, and other characteristics that people perceive when touching objects in their environment. Haptic rendering generally involves three steps: extracting the physical feature information of objects, haptic modeling of that feature information, and rendering the resulting haptic stimulation.

[0003] In virtual reality, the microscopic surface height and roughness of textures are important features in haptic modeling. Methods for obtaining the microscopic surface height of textured surfaces fall mainly into two categories: one is...


Application Information

IPC(8): G06F3/01, G06T7/40
Inventors: 吴涓, 丁彧, 宋爱国, 刘威, 宋光明, 王茜
Owner: SOUTHEAST UNIV