
Objective evaluation method for the realism of haptic reproduction based on human haptic perception characteristics

An objective evaluation method and haptic reproduction technology, applied in the fields of instruments, character and pattern recognition, computer parts, etc. It addresses the poor generality and consistency of existing evaluation results, eliminates the influence of environmental and human subjective factors, and is efficient and time-saving.

Active Publication Date: 2017-02-15
SOUTHEAST UNIV

AI Technical Summary

Problems solved by technology

The method is designed to overcome the poor generality and inconsistent evaluation results of existing force-tactile reproduction realism evaluation methods.




Embodiment Construction

[0027] Embodiments of the present invention will be specifically described below in conjunction with the accompanying drawings.

[0028] The objective evaluation method for the realism of force-tactile reproduction based on human tactile perception characteristics proposed by the present invention simulates the task process by which a person subjectively perceives and evaluates force-tactile reproduction. As shown in Figure 1, both subjective and objective evaluations of force-tactile reproduction realism comprise four basic steps: perception, conduction, classification, and evaluation. Tactile receptors and physical sensors belong to the perception step; nerve conduction and data preprocessing belong to the conduction step; the nerve center (brain) and the tactile perception model belong to the classification step; and the final result belongs to the evaluation step. Specific to the objective evaluation m...
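A minimal sketch of this four-stage pipeline is shown below. All function names (preprocess, perception_filter, evaluate) and the simple threshold filter standing in for the patent's tactile perception model are illustrative assumptions, not the actual implementation.

# Illustrative four-stage objective evaluation pipeline:
# perception (sensor data) -> conduction (preprocessing) ->
# classification (perception-model filtering) -> evaluation (similarity score).
import numpy as np

def preprocess(raw: np.ndarray) -> np.ndarray:
    """Conduction stage: center and normalise raw sensor samples."""
    centered = raw - raw.mean(axis=0)
    scale = centered.std(axis=0)
    return centered / np.where(scale == 0, 1.0, scale)

def perception_filter(signal: np.ndarray) -> np.ndarray:
    """Classification stage: map data into a perception space.
    A magnitude threshold is used here purely as a placeholder for a
    human haptic-perception model."""
    threshold = 0.1 * np.abs(signal).max()
    return np.where(np.abs(signal) < threshold, 0.0, signal)

def evaluate(real_raw: np.ndarray, virtual_raw: np.ndarray) -> float:
    """Evaluation stage: cosine similarity between real and virtual data
    after the conduction and classification stages."""
    r = perception_filter(preprocess(real_raw)).ravel()
    v = perception_filter(preprocess(virtual_raw)).ravel()
    denom = np.linalg.norm(r) * np.linalg.norm(v)
    return float(r @ v / denom) if denom else 0.0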



Abstract

The invention discloses an objective evaluation method for the realism of haptic reproduction based on human haptic perception characteristics. The method simulates the realism evaluation process that a person carries out subjectively. First, data are acquired for the same haptic reproduction task in a real environment and in a virtual environment, and a multidimensional matrix, the "haptic image", is generated through analysis. A model based on human haptic perception characteristics then applies perceptual filtering to the haptic image and maps the data into a perception space. Finally, a similarity analysis between the real-environment and virtual-environment data in the perception space completes the objective evaluation of haptic reproduction realism. The method has the following advantages: the influence of human subjective factors on the evaluation is effectively avoided, giving high stability and repeatability; compared with a conventional subjective evaluation experiment that requires many participants and a long time, it is efficient and saves time; human haptic perception characteristics are fully considered, so the process is closer to a real evaluation process; and the method is applicable to different haptic reproduction devices and tasks.
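As an illustration of the "haptic image" idea in the abstract, the sketch below stacks synchronized haptic channels into one multidimensional matrix per environment and compares them with a crude similarity score. The channel choice (force, position, velocity), array shapes, and the correlation measure are assumptions; the patent's actual perceptual filtering model is not reproduced.

# Assemble a "haptic image" from time-aligned channels and compare real vs. virtual.
import numpy as np

def build_haptic_image(force: np.ndarray,
                       position: np.ndarray,
                       velocity: np.ndarray) -> np.ndarray:
    """Stack time-aligned channels (each shape [T, 3]) into a T x 9 matrix."""
    return np.hstack([force, position, velocity])

# Example with synthetic data for a task sampled at T time steps.
T = 1000
rng = np.random.default_rng(0)
real_image = build_haptic_image(rng.normal(size=(T, 3)),
                                rng.normal(size=(T, 3)),
                                rng.normal(size=(T, 3)))
virtual_image = build_haptic_image(rng.normal(size=(T, 3)),
                                   rng.normal(size=(T, 3)),
                                   rng.normal(size=(T, 3)))

# Correlation of the flattened images serves as a stand-in objective realism score.
score = np.corrcoef(real_image.ravel(), virtual_image.ravel())[0, 1]
print(f"objective realism score (illustrative): {score:.3f}")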

Description

Technical Field

[0001] The invention belongs to the field of force-tactile interaction and, in particular, relates to an objective evaluation method for the realism of force-tactile reproduction based on human tactile perception characteristics.

Background Art

[0002] Virtual haptic reproduction uses devices to simulate the various mechanical stimuli that arise from interaction between a person and a real environment and applies them to the user, thereby producing a haptic experience similar to that of the real environment. Force-tactile reproduction not only reflects the objective physical properties of objects in the environment but is also closely related to human tactile perception characteristics; it is a vivid imitation and description of objectively existing objects.

[0003] Realistic haptic reproduction can improve the effectiveness of human-computer interaction and expand the practicality of virtual reality techn...

Claims


Application Information

IPC(8): G06K9/62, G06K9/40
CPC: G06V10/30, G06F18/2193, G06F18/22
Inventors: 吴涓, 邵知宇, 吴淼, 龚毅, 宋爱国
Owner: SOUTHEAST UNIV