
Man-machine interaction virtual-real fusion method and device based on mixed reality head-mounted display

A human-computer interaction and mixed reality technology, applied to the input/output of user/computer interaction, mechanical mode conversion, computer components, etc. It can solve problems such as the reduced authenticity of the interaction effect and poorly guaranteed modeling accuracy.

Pending Publication Date: 2022-08-09
BEIJING AERONAUTIC SCI & TECH RES INST OF COMAC +1

AI Technical Summary

Problems solved by technology

Because existing modeling technology cannot guarantee modeling accuracy well, the virtual human model differs from the tester in skin color, clothing, bone length, and so on, which reduces the authenticity of the interaction effect.



Examples


Embodiment 1

[0036] Figure 7 is a flow chart of a human-computer interaction virtual-real fusion method based on a mixed reality head-mounted display according to an embodiment of the present invention. As shown in Figure 7, the method includes the following steps:

[0037] Step S702: obtain the camera's intrinsic parameter matrix M_int and extrinsic parameter matrix M_ext.
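
As a concrete illustration of what these two matrices contain, the following minimal Python sketch uses the standard pinhole camera model; all numeric values (focal lengths, principal point, pose) are placeholder assumptions, not values from the patent.

```python
import numpy as np

# Placeholder intrinsics M_int (pinhole model): focal lengths fx, fy and
# principal point (cx, cy) are assumed values, not from the patent.
fx, fy, cx, cy = 800.0, 800.0, 640.0, 360.0
M_int = np.array([[fx, 0.0, cx],
                  [0.0, fy, cy],
                  [0.0, 0.0, 1.0]])

# Placeholder extrinsics M_ext = [R | t]: rotation R and translation t map
# world coordinates into the camera frame.
R = np.eye(3)                        # identity rotation as a stand-in
t = np.array([[0.0], [0.0], [1.0]])  # world origin sits 1 m in front of the camera
M_ext = np.hstack([R, t])            # 3x4 extrinsic matrix

# Projection of a homogeneous world point: x ~ M_int @ M_ext @ X_world.
X_world = np.array([0.1, 0.2, 2.0, 1.0])
x = M_int @ M_ext @ X_world
u, v = x[0] / x[2], x[1] / x[2]
print(f"projected pixel: ({u:.1f}, {v:.1f})")
```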

[0038] Step S704: according to the intrinsic and extrinsic parameter matrices, obtain the MR device's tracking coordinate system information {P_ori, R_ori} and the pose relationship t_related of the physical camera relative to the tracking origin.
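
Purely for illustration, this step's outputs could be grouped as below; the field names mirror the patent's symbols, but the container itself is an assumption.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class TrackingInfo:
    """Hypothetical grouping of the step S704 quantities (not the patent's
    own data structure)."""
    P_ori: np.ndarray      # headset position in the MR tracking coordinate system
    R_ori: np.ndarray      # headset orientation (3x3 rotation) in the same system
    t_related: np.ndarray  # pose of the physical camera relative to the tracking origin
```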

[0039] Step S706: using the MR device's tracking of the pose of the on-device camera and the head-mounted display relative to the tracking origin, obtain the camera's updated extrinsic parameter matrix via the formula M'_ext = T(P_ori, R_ori, t_related), where T(*) means that through P_ori, R_or...
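
Because the definition of T(*) is cut off above, the Python sketch below shows only one plausible reading: treat {P_ori, R_ori} as the headset pose in the tracking frame, treat t_related as a fixed 4x4 camera-to-headset transform, compose the two, and invert to obtain the world-to-camera extrinsics. This is an assumption, not the patent's actual formula.

```python
import numpy as np

def to_pose(R, p):
    """Build a 4x4 homogeneous pose from rotation R (3x3) and position p (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = p
    return T

def updated_extrinsics(P_ori, R_ori, T_related):
    """One plausible reading of M'_ext = T(P_ori, R_ori, t_related)."""
    T_headset = to_pose(R_ori, P_ori)       # headset pose in the tracking frame
    T_camera = T_headset @ T_related        # camera pose in the tracking frame
    return np.linalg.inv(T_camera)[:3, :]   # world-to-camera, 3x4 M'_ext

# Usage with stand-in values: headset at the tracking origin, camera offset
# 5 cm along the headset's x axis.
T_related = to_pose(np.eye(3), np.array([0.05, 0.0, 0.0]))
M_ext_new = updated_extrinsics(np.zeros(3), np.eye(3), T_related)
```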

Embodiment 2

[0069] Figure 8 is a structural block diagram of a human-computer interaction virtual-real fusion device based on a mixed reality head-mounted display according to an embodiment of the present invention. As shown in Figure 8, the device includes:

[0070] The acquisition module 80, configured to acquire the camera's intrinsic parameter matrix M_int and extrinsic parameter matrix M_ext.

[0071] The coordinate module 82, configured to obtain, according to the intrinsic and extrinsic parameter matrices, the MR device's tracking coordinate system information {P_ori, R_ori} and the pose relationship t_related of the physical camera relative to the tracking origin.

[0072] The calculation module 84, configured to use the MR device's tracking of the pose of the on-device camera and the head-mounted display relative to the tracking origin, applying the formula M'_ext = T(P_ori, R_ori, t_related) to obtain the camera's updated extrinsic par...
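
For orientation only, the three modules might be wired together as in the sketch below; the class and method names are invented, and the calculation step reuses the same plausible reading of T(*) as in Embodiment 1.

```python
import numpy as np

class FusionDevice:
    """Illustrative layout of modules 80/82/84; not the patent's implementation."""

    def acquire_parameters(self, calibration):
        # Acquisition module (80): read M_int / M_ext from a calibration source.
        self.M_int = calibration["M_int"]
        self.M_ext = calibration["M_ext"]

    def tracking_info(self, tracker):
        # Coordinate module (82): query {P_ori, R_ori} and t_related from the
        # MR device's tracking system ('tracker' is a stand-in interface).
        return tracker.headset_pose(), tracker.camera_offset()

    def updated_extrinsics(self, P_ori, R_ori, T_related):
        # Calculation module (84): one plausible form of
        # M'_ext = T(P_ori, R_ori, t_related).
        T_headset = np.eye(4)
        T_headset[:3, :3] = R_ori
        T_headset[:3, 3] = P_ori
        return np.linalg.inv(T_headset @ T_related)[:3, :]
```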



Abstract

The invention discloses a human-computer interaction virtual-real fusion method and device based on a mixed reality head-mounted display. The method comprises the following steps: acquiring a camera's intrinsic parameter matrix M_int and extrinsic parameter matrix M_ext; according to the intrinsic and extrinsic parameter matrices, obtaining the MR equipment's tracking coordinate system information {P_ori, R_ori} and the pose relationship between the physical camera and the tracking origin; obtaining the camera's updated extrinsic parameter matrix M'_ext; and obtaining a fused image according to the camera's intrinsic and extrinsic parameters. The method addresses a problem of the prior art: in order to achieve interactivity under the two conditions, a virtual human model whose clothes, skin color, skeleton length, and the like are similar to those of the testee is generally constructed in a virtual scene by means of modeling technology, and the virtual human model is driven by data to move synchronously with the testee. Because existing modeling technology cannot guarantee modeling precision well, the virtual human model differs from the tester in skin color, clothing, skeleton length, and the like, which reduces the authenticity of the interaction effect.

Description

technical field [0001] The invention proposes a human-computer interaction method for virtual-real dynamic fusion suitable for mixed reality simulation of civil aircraft. The real scene picture is obtained through the camera of the mixed reality head-mounted display, the three-dimensional position information of the human body's joint points is obtained using a multi-camera system, the pixel area of the limbs in the picture is then identified from the joint points' three-dimensional position information together with image processing technology, and finally that pixel area is extracted and fused into the displayed virtual scene, realizing a more realistic and accurate human-computer interaction virtual-real fusion method. The present invention therefore belongs to the technical fields of mixed reality and human-computer interaction. Background technique [0002] With the continuous development of intelligent technology, more and more intelligent equipment is used in people's life, work and stud...
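
To make the pipeline in paragraph [0001] concrete, here is a hedged Python sketch of its final stages: projecting the multi-camera joint positions into the headset-camera image and compositing the identified limb pixels over the virtual scene. The fixed-radius mask is a deliberate simplification; the patent does not specify its image processing details in this excerpt.

```python
import numpy as np

def fuse_real_into_virtual(frame, joints_3d, M_int, M_ext, virtual_frame):
    """Sketch of the [0001] pipeline tail; the masking heuristic is assumed."""
    # Project each 3D joint (N x 3) into pixel coordinates via M_int @ M_ext.
    joints_h = np.hstack([joints_3d, np.ones((len(joints_3d), 1))])
    proj = (M_int @ M_ext @ joints_h.T).T
    pixels = proj[:, :2] / proj[:, 2:3]

    # Crude limb mask: pixels within 40 px of any projected joint. A real
    # system would use segmentation informed by the joint positions.
    h, w = frame.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    mask = np.zeros((h, w), dtype=bool)
    for u, v in pixels:
        mask |= (xx - u) ** 2 + (yy - v) ** 2 < 40.0 ** 2

    # Fuse: copy the masked real-scene pixels into the virtual frame.
    fused = virtual_frame.copy()
    fused[mask] = frame[mask]
    return fused
```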


Application Information

IPC (8): G06T7/80; G06T7/20; G06T5/50; G06F3/01; G06T19/00; G06V10/62; G06V10/74; G06V40/10
CPC: G06T7/80; G06T5/50; G06F3/011; G06T7/20; G06V10/74; G06V10/62; G06T19/006; G06V40/10; G06T2207/20221; G06T2207/10024
Inventor: 成天壮, 杨东浩, 吴程程, 许澍虹, 王大伟, 杨志刚
Owner BEIJING AERONAUTIC SCI & TECH RES INST OF COMAC