
VR-based spatial cognitive ability training method and system

A VR-based spatial cognitive ability training method and system. It solves the problem that users could previously train spatial cognitive ability only in real-world scenes, thereby reducing training difficulty, improving the training effect, and keeping the user safe.

Inactive Publication Date: 2020-01-10
NORTHEASTERN UNIV +1

AI Technical Summary

Problems solved by technology

[0005] In order to solve the problem that users can only train spatial cognitive ability in real scenes, the present invention provides a VR-based spatial cognitive ability training method and system.



Examples


Embodiment 1

[0041] Suppose the path-information training shown in Figure 2 is used in Embodiment 1; it is described in detail below with reference to Figures 1-2.

[0042] The spatial cognition training system includes a VR device and an operating handle; the operating handle sends operating signals to the VR device. A plurality of training scenes for training users are pre-built in the VR device, and each training scene has at least three training levels. The training method includes:

[0043] S1. After receiving the scene trigger instruction, display the virtual reality scene to the user to be trained;

[0044] S2. After receiving the training level instruction for the displayed virtual reality scene, determine within it the path information shown in Figure 2;

[0045] S3. Receive the start-training instruction triggered by the user to be trained, and display the transparent current user character wi...
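As a rough illustration of the system configuration in [0042] and the instruction handling in S1-S2, the following sketch models a VR device holding pre-built scenes with at least three levels each. All class, method, and scene names are assumptions for illustration, not from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class TrainingScene:
    """One pre-built virtual reality scene; each scene must have at least three levels."""
    name: str
    levels: list = field(default_factory=list)  # one path layout per training level

    def __post_init__(self):
        if len(self.levels) < 3:
            raise ValueError("each training scene needs at least three training levels")

@dataclass
class VRDevice:
    """Holds the pre-built scenes and reacts to instructions from the operating handle."""
    scenes: dict = field(default_factory=dict)

    def on_scene_trigger(self, scene_name):
        # S1: after the scene trigger instruction, display the selected scene
        return self.scenes[scene_name]

    def on_level_instruction(self, scene, level_index):
        # S2: after the training level instruction, determine the path information
        return scene.levels[level_index]

# Hypothetical usage: intersections D0..D3 follow the labels used in the description.
device = VRDevice(scenes={
    "street": TrainingScene("street", levels=[["D0", "D1", "D3"],
                                              ["D0", "D2", "D3"],
                                              ["D0", "D1", "D2", "D3"]]),
})
scene = device.on_scene_trigger("street")
path = device.on_level_instruction(scene, 0)
```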

Embodiment 2

[0054] Suppose the path-information training shown in Figure 2 is used in Embodiment 2; it is described in detail below with reference to Figures 1-3.

[0055] S1. After receiving the scene trigger instruction, display the virtual reality scene to the user to be trained;

[0056] S2. After receiving the training level instruction for the displayed virtual reality scene, determine within it the path information shown in Figure 2;

[0057] The path information for Figure 2 includes a first array that stores the indexes of all reachable intersections from the starting point D0 to the end point D3 in the current virtual scene, covering both the intersection indexes of the correct route and the wrong intersection indexes. The intersection indexes of the correct route are stored in order and numbered sequentially starting from 0; the index number of the starting point D0 in the first ...
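The "first array" described in [0057] can be sketched as follows. The wrong-intersection names (W0, W1) and the helper function are hypothetical illustrations; only the D0-D3 labels and the sequential numbering from 0 come from the description.

```python
# Sketch of the first array: all reachable intersections from starting point D0
# to end point D3 in the current virtual scene, mixing the correct route's
# intersection indexes with the wrong intersection indexes.
first_array = ["D0", "D1", "D2", "D3", "W0", "W1"]  # W0/W1: hypothetical wrong turns

# Correct-route intersections are stored in walking order and numbered from 0,
# so the starting point D0 receives index 0.
correct_route = ["D0", "D1", "D2", "D3"]
correct_index = {name: i for i, name in enumerate(correct_route)}

def is_correct_choice(intersection):
    """True if the chosen intersection lies on the correct route."""
    return intersection in correct_index
```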



Abstract

The invention discloses a spatial cognitive ability training method and system based on VR equipment, and the training method comprises the steps: A1, receiving a scene triggering instruction, and displaying a virtual reality scene; A2, receiving a training level instruction, and determining corresponding path information; A3, receiving a training starting instruction, displaying a current user role in the virtual reality scene, and automatically walking according to the path information; A4, when the walking direction needs to be selected at the intersection of the path information, sending an operation prompt tone to the user; A5, receiving direction information triggered by a user, judging whether the direction information is correct or not, and if not, sending out a selection error prompt tone; and A6, receiving direction information triggered by the to-be-trained user again, judging whether the direction information is correct or not until the triggered direction information is correct, and enabling a current user role displayed in the virtual reality scene to automatically walk according to the path information until a destination is reached. According to the invention, the user can train the spatial cognitive ability in the virtual reality scene.
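The six steps A1-A6 in the abstract amount to a simple control loop: the avatar walks the path automatically, prompts the user at each decision intersection, plays an error tone on a wrong pick, and resumes walking only after a correct choice. The following is a minimal sketch under that reading; the function signature, event log format, and intersection names are all illustrative assumptions.

```python
def run_training(path, choices_at, user_choices):
    """Sketch of steps A1-A6: walk `path`, prompting the user wherever
    `choices_at` defines a decision, consuming picks from `user_choices`.
    Returns an event log for inspection."""
    log = [("scene", "displayed")]                 # A1: scene trigger received
    log.append(("path", path))                     # A2: path information determined
    picks = iter(user_choices)
    for node in path:                              # A3: avatar walks automatically
        options = choices_at.get(node)
        if not options:
            continue                               # no direction choice here
        log.append(("prompt", node))               # A4: operation prompt tone
        while True:                                # A5/A6: judge until correct
            choice = next(picks)
            if choice == options["correct"]:
                log.append(("correct", choice))    # correct: resume auto-walking
                break
            log.append(("error_tone", choice))     # wrong: selection-error tone
    log.append(("arrived", path[-1]))              # walking ends at the destination
    return log

# Hypothetical run: one decision at D1, first pick wrong, second pick correct.
log = run_training(
    path=["D0", "D1", "D3"],
    choices_at={"D1": {"correct": "right"}},
    user_choices=["left", "right"],
)
```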

Description

Technical field

[0001] The invention relates to a VR-based spatial cognitive ability training method and system.

Background technique

[0002] At present, mild cognitive impairment is a chronic degenerative disease of the nervous system, referring to a state of cognitive decline between normal aging and dementia that does not yet meet the criteria for Alzheimer's disease. However, patients diagnosed with mild cognitive impairment have an extremely high risk of developing Alzheimer's disease, with a conversion rate of 6% to 25% per year. The field has gone more than a decade without a breakthrough drug, as Alzheimer's disease drug development trials fail one after another, and existing drugs can only delay the progression of the disease. Under these circumstances, it is very necessary to improve spatial cognition and prevent the occurrence of senile dementia.

[0003] The current method for improving spatial cognition is mainly real-scene training conducted by staff, but the user is inconvenienced to mo...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01; G09B9/00
CPC: G06F3/011; G06F2203/012; G09B9/00
Inventor: 覃文军, 林国丛, 刘春燕, 王玉平, 陈超, 徐哲学, 韩涛, 王同亮, 杨金柱, 栗伟, 曹鹏, 冯朝路, 赵大哲
Owner: NORTHEASTERN UNIV