Real 3D virtual simulation interaction method and system

A three-dimensional virtual interaction technology, applied in the field of real three-dimensional virtual simulation interaction methods and systems, which can solve problems such as the single form of expression of knowledge points, the inability to achieve a situational reading and learning experience, and the heavy performance burden placed on lecturers.

Active Publication Date: 2018-11-16
Applicant: 辽宁向日葵教育科技有限公司


Problems solved by technology

At present, the online courses of MOOCs are all produced with special effects and post-production synthesis in the existing virtual studios on the market to achieve interaction with 3D virtual elements; they cannot control 3D elements in real time or perform timely virtual simulation interaction.
Moreover, when explaining some knowledge points in a MOOC, the form of expression of the knowledge points is single, and a situational, dynamic, and visual reading and learning experience cannot be achieved.
At the same time, the existing approach still relies on traditional technology to produce MOOCs in the form of dubbed classroom recordings.



Examples


Embodiment 1

[0062] A real three-dimensional virtual simulation interaction method, shown in Figure 1, includes the following steps:

[0063] S1: Obtain multiple input video sources and/or interactive content;

[0064] S2: Obtain the overlay mode input by the user;

[0065] Specifically, the overlay mode indicates which video sources and/or interactive content need to be overlaid, and the layer each of them occupies.

[0066] S3: Composite the video sources and/or interactive content in layers according to the overlay mode to obtain a composite video;

[0067] Specifically, suppose the data to be composited includes video source A, video source B, interactive content A, and interactive content B. If the overlay mode input by the user is to overlay interactive content A and video source A, with interactive content A superimposed on video source A, then during compositing video source A is used as the bottom layer, and interactive cont...
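As an illustration only (not the patent's own implementation), the sketch below composites one frame according to a user-supplied overlay mode: sources are sorted by layer and blended bottom-up. The names OverlayEntry and composite_frame, the use of numpy RGBA frames, and the simplified alpha blend are assumptions made for the example.

```python
# Minimal sketch of steps S1-S3, assuming each source delivers RGBA frames
# as numpy uint8 arrays. OverlayEntry / composite_frame are illustrative names.
from dataclasses import dataclass
from typing import Dict, List

import numpy as np


@dataclass
class OverlayEntry:
    source_id: str  # which video source or interactive content to draw
    layer: int      # 0 = bottom layer; higher layers are drawn on top


def composite_frame(frames: Dict[str, np.ndarray],
                    overlay_mode: List[OverlayEntry]) -> np.ndarray:
    """S3: composite the selected sources bottom-up according to the overlay mode."""
    ordered = sorted(overlay_mode, key=lambda e: e.layer)
    out = frames[ordered[0].source_id].astype(float) / 255.0
    for entry in ordered[1:]:
        top = frames[entry.source_id].astype(float) / 255.0
        alpha = top[..., 3:4]  # alpha channel of the upper layer
        # Simplified "over" blend; treats the bottom layer as effectively opaque.
        out[..., :3] = top[..., :3] * alpha + out[..., :3] * (1.0 - alpha)
        out[..., 3:4] = alpha + out[..., 3:4] * (1.0 - alpha)
    return (out * 255).astype(np.uint8)


# Overlay mode from the example above: interactive content A on top of video source A.
mode = [OverlayEntry("video_A", 0), OverlayEntry("interactive_A", 1)]
sources = {name: np.zeros((1080, 1920, 4), dtype=np.uint8)  # S1: the input frames
           for name in ("video_A", "interactive_A")}
frame = composite_frame(sources, mode)  # S2 (mode) + S1 (sources) -> composite frame
```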

Embodiment 2

[0074] Embodiment 2 adds the following sonar-pen interaction functions on the basis of Embodiment 1.

[0075] The output device includes a writing screen. After switching between composite videos on the output device, the method, referring to Figure 2, also includes:

[0076] S11: Capture the real-time position of the sonar pen on the writing screen;

[0077] Specifically, the sonar pen is an existing, mature product.

[0078] S12: Establish an association between the real-time position of the sonar pen and the real-time position of the mouse;

[0079] Specifically, after the association is established, the user can simulate mouse actions such as dragging, clicking, and double-clicking directly by operating the sonar pen.

[0080] S13: Obtain the control command generated by the user operating the sonar pen;

[0081] S14: Simulate mouse operations according to the control command to drag, rotate, scale, or split the synthes...
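A possible shape for steps S11 to S14, assuming the sonar pen reports normalized coordinates on the writing screen; PenEvent, MouseBackend, and PenToMouse are illustrative names, and the mouse backend is left abstract so that any OS-level input-injection API could stand behind it.

```python
# Sketch of the pen-to-mouse association (S12) and mouse simulation (S13/S14),
# under the assumptions named in the lead-in; not the patent's implementation.
from dataclasses import dataclass
from typing import Protocol


@dataclass
class PenEvent:
    x: float        # normalized position on the writing screen, 0..1 (S11)
    y: float
    tip_down: bool  # True while the pen tip touches the screen


class MouseBackend(Protocol):
    def move(self, x: int, y: int) -> None: ...
    def press(self) -> None: ...
    def release(self) -> None: ...


class PenToMouse:
    """Associates the pen position with the mouse position (S12) and replays
    the user's pen gestures as mouse operations (S13/S14)."""

    def __init__(self, backend: MouseBackend, screen_w: int, screen_h: int):
        self.backend = backend
        self.screen_w = screen_w
        self.screen_h = screen_h
        self._was_down = False

    def handle(self, ev: PenEvent) -> None:
        # S12: map the pen's normalized screen position to desktop pixels.
        px = int(ev.x * self.screen_w)
        py = int(ev.y * self.screen_h)
        self.backend.move(px, py)
        # S13/S14: pen-down/pen-up transitions become button press/release,
        # so drags, clicks, and double-clicks fall out of the event stream.
        if ev.tip_down and not self._was_down:
            self.backend.press()
        elif not ev.tip_down and self._was_down:
            self.backend.release()
        self._was_down = ev.tip_down


class LoggingMouse:
    """Stand-in backend that only logs the simulated mouse operations."""
    def move(self, x: int, y: int) -> None: print(f"move to ({x}, {y})")
    def press(self) -> None: print("button down")
    def release(self) -> None: print("button up")


bridge = PenToMouse(LoggingMouse(), screen_w=1920, screen_h=1080)
for ev in (PenEvent(0.25, 0.50, True), PenEvent(0.40, 0.55, True), PenEvent(0.40, 0.55, False)):
    bridge.handle(ev)  # a short drag: press, move, release
```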

Embodiment 3

[0087] Embodiment 3 adds the following on the basis of the foregoing embodiments:

[0088] The output device also includes a projector and a video camera;

[0089] The writing screen is fixed in front of a wall; the projector is installed between the writing screen and the wall for rear projection; the camera is placed in front of the writing screen with its lens facing the writing screen.

[0090] Specifically, the writing screen may be a green screen. The distance between the writing screen and the wall is preferably 1.1 m, the distance between the projector and the writing screen is preferably 0.3 m, and the distance between the camera lens and the writing screen is preferably 3.3 m. The rear-projection installation of the projector is suitable for environments with a small audience and good ambient light and lighting conditions.
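For concreteness, the preferred distances above can be captured in a small configuration record; StudioLayout and its field names are hypothetical, and only the numeric values come from the text.

```python
# Hypothetical configuration record for the Embodiment 3 layout.
from dataclasses import dataclass


@dataclass(frozen=True)
class StudioLayout:
    screen_to_wall_m: float = 1.1       # writing screen fixed 1.1 m in front of the wall
    projector_to_screen_m: float = 0.3  # rear projector between screen and wall
    camera_to_screen_m: float = 3.3     # camera lens facing the screen from the front
```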

[0091] Further, referring to Figure 3, the transmission of the composite video to the output device for output spe...



Abstract

The invention provides a real 3D virtual simulation interaction method and system. The method includes the following steps: multiple input video sources and/or interactive contents are obtained; an overlay mode input by a user is obtained; the video sources and/or interactive contents are composited according to the overlay mode to obtain a composite video; the composite video is transmitted to an output device and output; and a switching instruction input by the user is obtained, and switching among composite videos is realized on the output device. With the method, the person recording the course can interact in real time with the video sources or interactive contents that accompany the knowledge points being explained, and with their support can explain abstract concepts and practical processes more concretely. The multi-channel 3D video is fused, analyzed, and accurately matched at the output device, multi-path output of virtual-real interactive simulation is carried out, a live practice video stream is captured and fed into the real 3D virtual scene, and the teacher interacts with the real 3D virtual scene in real time.
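A minimal sketch of the switching step described above, under assumed names (OutputSwitcher is not from the patent): several composite video streams are held, and a user switching instruction selects which one the output device shows.

```python
# Illustrative switcher for the composite videos delivered to the output device.
from typing import Dict, Iterator


class OutputSwitcher:
    def __init__(self, composites: Dict[str, Iterator]):
        self.composites = composites          # name -> composite video frame stream
        self.active = next(iter(composites))  # start on the first composite

    def switch(self, instruction: str) -> None:
        """Apply a user switching instruction (here: the name of a composite)."""
        if instruction in self.composites:
            self.active = instruction

    def next_frame(self):
        """Next frame of the active composite, as sent to the output device."""
        return next(self.composites[self.active])


clips = {"composite_1": iter(["f1a", "f1b"]), "composite_2": iter(["f2a", "f2b"])}
sw = OutputSwitcher(clips)
sw.next_frame()           # "f1a" from composite_1
sw.switch("composite_2")  # user switching instruction
sw.next_frame()           # "f2a" from composite_2
```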

Description

Technical field

[0001] The invention belongs to the technical field of educational guidance, and in particular relates to a real three-dimensional virtual simulation interaction method and system.

Background technique

[0002] With the development of the Internet, many existing educational institutions prefer distance education or online education, owing to the convenience these offer: users do not need to travel to designated training venues and can receive different kinds of education directly online, which is why these forms are well liked by users.

[0003] For example, MOOC is a learning platform based on online education, offering multiple courses to help users study at a distance. At present, the online courses of MOOCs are all produced with special effects and post-production synthesis in the existing virtual studios on the market to achieve interaction with 3D virtual elements. It is impossible to control 3D elements in real time and conduct timely virtual s...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G09B 5/14, G06T 13/20
CPC: G06T 13/20, G09B 5/14
Inventor: 李宏伟
Owner: 辽宁向日葵教育科技有限公司