
A vibration and tactile encoding and decoding method based on deep learning

An encoding and decoding method using deep learning, applied in the field of deep-learning-based vibrotactile encoding and decoding. It addresses the problem that tactile information processing has not yet reached a high quality level, improves the quality of the reconstructed signal, meets real-time encoding requirements, and exploits the strong nonlinear-mapping capability of deep networks.

Active Publication Date: 2022-04-12
FUZHOU UNIV
Cites: 5 · Cited by: 0

AI Technical Summary

Problems solved by technology

[0002] Current research on tactile information has not reached the same high-quality level as auditory information



Embodiment Construction

[0026] The present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0027] It should be pointed out that the following detailed description is exemplary and is intended to provide further explanation to the present application. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.

[0028] It should be noted that the terminology used here is only for describing specific implementations and is not intended to limit the exemplary implementations according to the present application. As used herein, unless the context clearly dictates otherwise, the singular is intended to include the plural. It should also be understood that when the terms "comprising" and/or "including" are used in this specification, they indicate the presence of the stated features, steps, operations, devices, components, and/or combinations thereof.



Abstract

The present invention relates to a deep-learning-based vibrotactile encoding and decoding method comprising the following steps: the different dimensions of the tactile signal are jointly encoded, removing the redundancy present across the three spatial dimensions of the tactile data, which is preprocessed at the same time; a gated recurrent unit (GRU) network is trained, with two groups of data input at each step to predict the next group and the true value of the next group used as the label; the predicted data are compared with the real data to compute a residual, which compensates the prediction to yield the reconstructed prediction; the reconstructed prediction is then packed with the previous group of data and used as the input for the next round of prediction. Compared with the prior art, the present invention greatly improves performance.
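The predict/residual/compensate loop described in the abstract can be sketched as below. This is a minimal illustration, not the patent's implementation: `predict_next` is a hypothetical stand-in (a simple linear extrapolation) for the trained GRU network, and the group sizes and data are invented for the example. The key structure is the one the abstract describes: the encoder transmits residuals between real and predicted groups, and both sides feed the compensated reconstruction back in as input for the next round.

```python
import numpy as np

# Hypothetical stand-in for the trained GRU predictor: given the two
# previous groups of tactile samples, predict the next group. The patent
# uses a trained GRU network here; linear extrapolation is for illustration.
def predict_next(prev2, prev1):
    return 2.0 * prev1 - prev2

def encode(groups):
    """Encoder: emit the first two groups verbatim, then one residual
    (real minus predicted) per subsequent group."""
    residuals = []
    prev2, prev1 = groups[0], groups[1]
    for real in groups[2:]:
        pred = predict_next(prev2, prev1)
        residual = real - pred              # transmitted to the decoder
        residuals.append(residual)
        recon = pred + residual             # compensated reconstruction
        prev2, prev1 = prev1, recon         # packed for the next round
    return groups[0], groups[1], residuals

def decode(g0, g1, residuals):
    """Decoder: mirror the encoder, compensating each prediction with the
    received residual and feeding the reconstruction back in."""
    out = [g0, g1]
    prev2, prev1 = g0, g1
    for r in residuals:
        recon = predict_next(prev2, prev1) + r
        out.append(recon)
        prev2, prev1 = prev1, recon
    return out

# Round trip on a toy vibrotactile trace: 4 groups of 3 samples each.
groups = [np.array(g, dtype=float) for g in
          [[0.0, 0.1, 0.2], [0.2, 0.3, 0.1], [0.1, 0.0, 0.2], [0.3, 0.2, 0.1]]]
g0, g1, res = encode(groups)
decoded = decode(g0, g1, res)
assert all(np.allclose(a, b) for a, b in zip(groups, decoded))
```

With full-precision residuals the round trip is exact; in a practical codec the residuals would be quantized and entropy-coded, trading fidelity for rate.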

Description

Technical Field

[0001] The present invention relates to the technical field of video coding, and in particular to a deep-learning-based vibrotactile coding and decoding method.

Background Technique

[0002] Current research on tactile information has not reached the same high quality level as auditory information. In particular, compared with audio and video, haptic codec technology still lags considerably behind and requires further research in design and optimization. It is therefore both necessary and urgent to design a haptic codec that significantly reduces the data rate while simultaneously achieving high fidelity and low latency.

[0003] Current haptic codec designs fall mainly into two categories. The first category comprises coding and decoding algorithms based on the physiology of human tactile perception; the second applies compression algorithms in the frequency domain.

Contents of the Invention

[0004] In view of...
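The second codec category mentioned in the background, frequency-domain compression, can be illustrated with a short sketch. The transform and parameters here are assumptions for illustration (the patent text does not specify them): `np.fft.rfft` stands in for the frequency-domain transform, and compression is modeled as keeping only the largest coefficients.

```python
import numpy as np

def compress(signal, keep):
    """Transform to the frequency domain and keep only the `keep`
    largest-magnitude coefficients (all others zeroed)."""
    coeffs = np.fft.rfft(signal)
    idx = np.argsort(np.abs(coeffs))[::-1][:keep]
    kept = np.zeros_like(coeffs)
    kept[idx] = coeffs[idx]
    return kept

def decompress(kept, n):
    """Inverse transform back to the time domain."""
    return np.fft.irfft(kept, n=n)

# A toy vibration trace built from two pure tones (5 Hz and 12 Hz over a
# unit window), so its spectrum is concentrated in a few bins.
n = 256
t = np.arange(n) / n
sig = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)

recon = decompress(compress(sig, keep=4), n)
err = np.max(np.abs(sig - recon))
assert err < 1e-6  # two exact tones need only a few coefficients
```

For signals whose energy is spread across many frequencies (as real tactile textures often are), this fixed-transform approach degrades, which motivates the learned, prediction-based scheme of the present invention.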

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F3/01; G06N3/04
CPC: G06F3/016; G06N3/04
Inventors: 赵铁松, 王楷, 徐艺文, 房颖, 冯伟泽, 郑权斐
Owner: FUZHOU UNIV