Dynamic video tactile feature extraction and rendering method

A feature extraction and video technology, applied in the field of virtual reality and human-computer interaction, which addresses the problems that vibration-based tactile feedback has low resolution and that existing methods consider only intra-frame features while ignoring inter-frame dynamic features, achieving the effects of increased feature information, rich dimensions, and excellent user-friendliness.

Active Publication Date: 2020-06-16
JILIN UNIV

AI Technical Summary

Problems solved by technology

In 2014, Kim et al. proposed a saliency-driven video-to-tactile conversion for 4D films based on vibration, but tactile feedback generated by vibration has the disadvantage of low resolution, and the approach does not take the dynamic nature of video into account.

[0006] The Chinese patent "A Video Chatting Method and Terminal Integrating Tactile Sensing Functions" (publication number CN104717449A) discloses a tactile feedback method based on real-time video communication for mobile terminals. Similarly, it considers only spatial intra-frame features; dynamic inter-frame features are not taken into account.




Embodiment Construction

[0056] Referring to Figure 2, the method includes the following steps:

[0057] (1) Decompress and process the received video;

[0058] (2) Preprocess the video, segmenting it into shots based on inter-frame color histogram features:

[0059] (1) First convert the image from RGB space to HSI space to obtain the hue H, saturation S, and intensity I of each pixel:

[0060] H = θ, if B ≤ G; H = 360° − θ, if B > G

[0061] where

[0062] θ = arccos{ [(R − G) + (R − B)] / (2[(R − G)² + (R − B)(G − B)]^(1/2)) }

[0063] S = 1 − 3·min(R, G, B)/(R + G + B), I = (R + G + B)/3
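The formulas above are the textbook geometric RGB-to-HSI conversion. A minimal sketch of this step, assuming float RGB input in [0, 1]; the function name rgb_to_hsi is hypothetical, as the patent does not publish reference code:

```python
import numpy as np

def rgb_to_hsi(rgb):
    """Convert an HxWx3 float RGB image with values in [0, 1] to HSI.

    Illustrative sketch of the standard geometric conversion; the
    patent itself does not publish reference code.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    eps = 1e-8  # guards against division by zero on black pixels

    # Intensity: mean of the three channels.
    i = (r + g + b) / 3.0

    # Saturation: 1 minus the normalized minimum channel.
    s = 1.0 - 3.0 * np.minimum(np.minimum(r, g), b) / (r + g + b + eps)

    # Hue: angle in the chromatic plane, folded to [0, 2*pi).
    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b)) + eps
    theta = np.arccos(np.clip(num / den, -1.0, 1.0))
    h = np.where(b <= g, theta, 2.0 * np.pi - theta)

    return np.stack([h, s, i], axis=-1)
```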

[0064] (2) Then non-uniformly quantize the HSI values according to human color perception: the hue H space is divided into 8 parts, the saturation S space into 3 parts, and the intensity I space into 3 parts, partitioning the entire HSI color space into 72 subspaces (8×3×3). Different weights are assigned to the three color components according to the sensitivity of human vision, and the following formula synthesizes them into a one-dimensional feature value:

[0065] L = 9H + 3S + V, where H, S, and V denote the quantized hue, saturation, and brightness levels, so that L ranges over the 72 values 0–71.

[0066] (3) Count the number of pixels...
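Steps (2) and (3) can be sketched together as follows, reusing the rgb_to_hsi sketch above. The bin boundaries, the L1 histogram distance, and the boundary threshold are illustrative assumptions rather than values disclosed in the patent:

```python
import numpy as np

def hsi_histogram(hsi):
    """72-bin color histogram via non-uniform HSI quantization
    (H into 8 parts, S and I into 3 parts each, L = 9H + 3S + I)."""
    h, s, i = hsi[..., 0], hsi[..., 1], hsi[..., 2]
    hq = np.minimum((h / (2.0 * np.pi) * 8).astype(int), 7)  # 8 hue levels
    sq = np.minimum((s * 3).astype(int), 2)                  # 3 saturation levels
    iq = np.minimum((i * 3).astype(int), 2)                  # 3 intensity levels
    l = 9 * hq + 3 * sq + iq                                 # one-dimensional feature, 0..71
    hist = np.bincount(l.ravel(), minlength=72).astype(float)
    return hist / hist.sum()  # normalize so histograms are comparable

def segment_shots(frames, threshold=0.35):
    """Place a shot boundary wherever consecutive frame histograms
    differ by more than `threshold` (a hypothetical tuning value)."""
    hists = [hsi_histogram(rgb_to_hsi(f)) for f in frames]
    boundaries = [0]
    for k in range(1, len(hists)):
        if np.abs(hists[k] - hists[k - 1]).sum() > threshold:
            boundaries.append(k)
    return boundaries
```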



Abstract

The invention relates to a dynamic video tactile feature extraction and rendering method, and belongs to the field of virtual reality and human-computer interaction. The method comprises the steps of decompressing a received video, preprocessing the video, segmenting shots based on inter-frame color histogram features, extracting saliency maps that fuse spatio-temporal tactile saliency features from all frames in each segmented shot, and performing pixel-level tactile rendering according to the saliency maps of the video frames. By extracting video-frame salient features that fuse spatial and temporal characteristics, the method divides the video content into salient and non-salient regions. Pixel-level tactile stimulation is applied to the video frame through a one-to-one mapping between the visual and tactile channels, and real-time tactile feedback is generated through the terminal, enriching the user's experience of watching the video. The method can be widely applied to video education, multimedia entertainment, and human-computer interaction.
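As a rough illustration of the pipeline described above, the following sketch fuses a simple intra-frame contrast cue with an inter-frame change cue into a per-pixel saliency map and maps saliency one-to-one onto vibration amplitudes. The two cues, the fusion weight alpha, and the function names are assumptions for illustration, not the patent's actual saliency model:

```python
import numpy as np

def spatiotemporal_saliency(prev_frame, frame, alpha=0.5):
    """Fuse a spatial and a temporal saliency cue for one video frame.

    Simplified stand-in for the fused spatio-temporal saliency map:
    `alpha` weights an intra-frame contrast cue against an
    inter-frame change cue (both illustrative choices).
    """
    gray = frame.mean(axis=-1)                 # HxWx3 -> HxW luminance proxy
    prev_gray = prev_frame.mean(axis=-1)
    spatial = np.abs(gray - gray.mean())       # intra-frame contrast cue
    temporal = np.abs(gray - prev_gray)        # inter-frame motion/change cue
    sal = alpha * spatial + (1.0 - alpha) * temporal
    return sal / (sal.max() + 1e-8)            # normalize to [0, 1]

def tactile_render(saliency, max_amplitude=1.0):
    """Map each pixel's saliency to a drive amplitude one-to-one,
    realizing pixel-level tactile stimulation on the terminal."""
    return max_amplitude * saliency
```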

Description

Technical Field

[0001] The invention belongs to the field of virtual reality and human-computer interaction, and in particular relates to a method for dynamic video feature extraction and tactile reproduction.

Background Technique

[0002] The development of 4G technology has made video streaming applications grow exponentially. The advent of the 5G era will support the construction of another important human sensory channel on the basis of the existing audio and video channels: the tactile channel, enriching the performance and experience of human-computer interaction. Realizing feature extraction and tactile rendering for video media will therefore become an important technical challenge.

[0003] Existing feature extraction methods for tactile rendering usually take static images as the extraction objects, whereas videos are composed of image sequences, adding a time dimension to the original two-dimensional space of images. Due to the temporal ...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06K9/00G06K9/46G06K9/62G06T7/215G06T7/246G06T7/269G06T15/00
CPCG06T7/215G06T7/246G06T7/269G06T15/005G06T2207/10016G06V20/49G06V20/46G06V10/56G06V10/462G06F18/253
Inventor 孙晓颖韩宇刘国红赵越宋瑞
Owner JILIN UNIV