Virtual experiment system and method based on multi-modal interaction

A virtual experiment technology based on multi-modal interaction, applied in the field of virtual experiments, which addresses the problem of low virtual-real interaction efficiency.

Active Publication Date: 2020-09-11
UNIV OF JINAN


Problems solved by technology

[0005] This application provides a virtual experiment system and method based on multi-modal interaction, to solve the problem of low virtual-real interaction efficiency caused by the virtual experiment methods of the prior art.

Method used


Examples


Embodiment 1

[0088] See Figure 1, which is a schematic structural diagram of a virtual experiment system based on multi-modal interaction provided by an embodiment of the present application. As can be seen from Figure 1, the virtual experiment system based on multi-modal interaction in this embodiment mainly includes: an input layer, a perception and recognition layer, a fusion layer, and an application layer.

[0089] The input layer is used to collect the depth information of human skeletal nodes through the visual channel, sensing signals through the tactile channel, and voice signals through the auditory channel. The depth information of the human skeletal nodes includes the coordinates of the joint points of the human hand; the sensing signals include magnetic signals, photosensitive signals, touch signals, and vibration signals. The perception and recognition layer is used to recognize the information of the visual channel and the auditory chan...
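The four-layer architecture described above can be sketched as a simple pipeline. This is a minimal illustrative sketch, not the patent's implementation: all class names, method names, and stub values below are assumptions introduced for illustration.

```python
# Hypothetical sketch of the four-layer pipeline: input ->
# perception/recognition -> fusion -> application. Names and
# stub values are illustrative assumptions, not from the patent.

class InputLayer:
    """Collects raw data from the visual, tactile and auditory channels."""
    def collect(self):
        # Stub values standing in for real sensor reads.
        return {
            "visual": {"hand_joints": [(0.1, 0.2, 0.9)]},   # depth-camera joints
            "tactile": {"magnetic": 0, "photosensitive": 1,
                        "touch": 1, "vibration": 0},        # binary sensor states
            "auditory": {"speech_pcm": b"\x00\x01"},        # raw audio chunk
        }

class PerceptionLayer:
    """Maps each channel's raw data to a recognized label."""
    def recognize(self, raw):
        # Placeholder decisions; a real system would run a gesture CNN,
        # sensor-pattern rules, and a speech recognizer here.
        return {
            "visual": "grasp-gesture",
            "tactile": "holding-beaker",
            "auditory": "command:pour",
        }

class FusionLayer:
    """Decision-level fusion: combine per-channel recognition results."""
    def fuse(self, results):
        # Trivial stand-in rule: derive a single interaction intent.
        return {"intent": "pour", "confidence": 1.0} if results else None

class ApplicationLayer:
    """Presents the experiment via voice, visuals and haptic feedback."""
    def present(self, decision):
        return f"executing intent '{decision['intent']}' in the AR scene"

def run_pipeline():
    raw = InputLayer().collect()
    recognized = PerceptionLayer().recognize(raw)
    decision = FusionLayer().fuse(recognized)
    return ApplicationLayer().present(decision)
```

The point of the sketch is the strict layering: each layer consumes only the previous layer's output, which is what lets the fusion step operate on per-channel decisions rather than raw signals.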

Embodiment 2

[0128] On the basis of the embodiments illustrated in Figures 1-6, see Figure 7, which is a schematic flowchart of a virtual experiment method based on multi-modal interaction provided by an embodiment of the present application. As can be seen from Figure 7, the virtual experiment method in this embodiment mainly includes the following processes:

[0129] S1: Collect the corresponding visual information, sensing signals, and voice signals through the visual channel, tactile channel, and auditory channel, respectively. The sensing signals include magnetic signals, photosensitive signals, touch signals, and vibration signals.
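One synchronized sample from step S1 bundles the three channels together. The container below is an illustrative sketch; the field names and sample values are assumptions, not taken from the patent.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Illustrative container for one synchronized multi-channel sample per S1:
# skeletal depth data (visual channel), the four tactile sensor readings
# (tactile channel), and a chunk of raw audio (auditory channel).
# Field names are assumptions for this sketch.

@dataclass
class MultiModalFrame:
    hand_joints: List[Tuple[float, float, float]]  # (x, y, depth) per joint
    magnetic: bool        # magnetic sensor state
    photosensitive: bool  # photosensitive sensor state
    touch: bool           # touch sensor state
    vibration: bool       # vibration sensor state
    audio_pcm: bytes      # raw voice signal chunk

frame = MultiModalFrame(
    hand_joints=[(0.42, 0.17, 0.88)],
    magnetic=False, photosensitive=True, touch=True, vibration=False,
    audio_pcm=b"\x00\x01",
)
```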

[0130] S2: Recognize the information of the visual channel, tactile channel, and auditory channel, respectively.

[0131] Among these, the information recognition method for the visual channel mainly includes the following steps:

[0132] S201: Build an AR environment.

[0133] S202: Train a gesture recognition model in a convolutional neura...



Abstract

The invention discloses a virtual experiment system and method based on multi-modal interaction. The experiment system mainly comprises an input layer, a perception and recognition layer, a fusion layer, and an application layer. The method comprises the steps of: acquiring corresponding visual information, sensing signals, and voice signals through a visual channel, a tactile channel, and an auditory channel, respectively; recognizing the information of the different channels respectively; fusing, according to the recognition results, the corresponding vector sets constructed from the modal information of the visual, tactile, and auditory channels on the AR platform by adopting a decision-level multi-modal fusion method; and, according to the fusion result, presenting the experiment process and experiment results by means of voice navigation, visual display, and tactile feedback. Through this application, multiple channels can be fully utilized and the experiment process is realized with a multi-modal fusion method, so that the user's operating load is reduced, the immersion of the experiment is improved, and the efficiency of virtual-real interaction is improved.
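The decision-level fusion described in the abstract combines each channel's own recognition decision rather than the raw features. The sketch below shows one common way to do this, weighted voting over per-channel label scores; the channel weights, labels, and scores are assumptions chosen for the example, not values from the patent.

```python
# Hedged sketch of decision-level multi-modal fusion: each channel's
# recognizer emits label scores, each channel's top decision is scaled
# by a per-channel weight, and the label with the largest total wins.
# Weights, labels, and probabilities below are illustrative assumptions.

from collections import defaultdict

def decision_level_fusion(channel_scores, channel_weights):
    """channel_scores: {channel: {label: probability}} -> winning label."""
    totals = defaultdict(float)
    for channel, scores in channel_scores.items():
        w = channel_weights.get(channel, 1.0)
        best_label = max(scores, key=scores.get)      # this channel's decision
        totals[best_label] += w * scores[best_label]  # weighted vote
    return max(totals, key=totals.get)

fused = decision_level_fusion(
    {"visual":   {"pour": 0.7, "stir": 0.3},
     "tactile":  {"pour": 0.6, "stir": 0.4},
     "auditory": {"stir": 0.55, "pour": 0.45}},
    {"visual": 0.5, "tactile": 0.2, "auditory": 0.3},
)
# Visual and tactile channels outvote the auditory channel here.
```

Fusing at the decision level keeps the channels independent: each recognizer can be trained and replaced separately, and a noisy channel only contributes one (down-weighted) vote instead of corrupting a shared feature vector.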

Description

Technical field

[0001] The present application relates to the field of virtual experiment technology, and in particular to a virtual experiment system and method based on multi-modal interaction.

Background technique

[0002] With the development of human-computer interaction technology, the use of augmented reality technology to present virtual experiments has become more and more widely adopted in teaching and education, especially in chemistry education, where dangerous chemicals and hazardous experimental phenomena make the application of virtual experiments increasingly urgent. How to design virtual experiment methods and systems so as to avoid the risks of students operating real experiments, while improving students' interest in learning, is an important issue in virtual experiment design.

[0003] The current virtual experiment method usually uses augmented reality technology to complete the ren...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01; G06K9/00; G06K9/62; G06N3/04; G06N3/08
CPC: G06F3/011; G06F3/016; G06F3/017; G06N3/08; G06V40/28; G06N3/045; G06F18/24
Inventor: 冯志全 (Feng Zhiquan); 肖梦婷 (Xiao Mengting)
Owner UNIV OF JINAN