
Multi-interaction method and device based on Kinect and Unity3D

An interactive method and device technology, applied in the input/output of user/computer interaction, input/output processes for data processing, instruments, and the like. It can solve problems such as a cumbersome 3D registration algorithm, many restrictions, and few interaction methods, and achieve the effects of a flexibly triggered registration mechanism, simplified displacement changes, and enriched interaction methods.

Active Publication Date: 2017-10-10
BEIJING RES CENT OF INTELLIGENT EQUIP FOR AGRI +1

AI Technical Summary

Problems solved by technology

[0003] However, currently common technologies have the following defects: 1) The interaction method is simplistic: registration of the virtual model can only be triggered by a calibration image or text, and after registration the model can only be translated and rotated and can only follow the movement of the calibration object, so there are few interaction methods and many restrictions. 2) The 3D registration algorithm is cumbersome: the position and posture of the model must first be determined in the feature-point coordinate system, then converted to the camera coordinate system, and finally the virtual model must be fused with the real scene and aligned to the coordinates of the display screen for display.
It can be seen that existing technology requires many calculation steps in the 3D registration stage of the virtual model, and the operation is neither simple nor automatic enough.
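For reference, the conventional marker-based registration chain described above can be written out as follows; the notation is illustrative and not taken from the patent. A model point $p_m$ expressed in the feature-point (marker) coordinate system is first transformed into the camera coordinate system and then projected to display-screen coordinates:

\[ p_c = R_{cm}\, p_m + t_{cm} \qquad \text{(feature-point frame to camera frame)} \]
\[ s \begin{pmatrix} u \\ v \\ 1 \end{pmatrix} = K\, p_c \qquad \text{(camera frame to screen pixels, with } K \text{ the camera intrinsic matrix)} \]

Here $R_{cm}$ and $t_{cm}$ have to be re-estimated from the tracked calibration image in every frame, which accounts for the extra calculation steps noted above.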



Examples


Embodiment Construction

[0035] Embodiments of the present invention will be described in detail below with reference to the accompanying drawings.

[0036] As shown in Figure 1, the present invention provides a multi-interaction method based on Kinect and Unity3D, comprising the following specific steps:

[0037] Step S1: Adjust the camera parameters in Unity3D to be consistent with the effective detection range of the Kinect. Specifically, place the Kinect at a preset position in the real scene, and adjust the real scene so that it lies within the effective detection range of the Kinect, where the effective range is 1.2 to 3.6 meters from the camera, 57 degrees horizontally, and 43 degrees vertically.

[0038] Furthermore, the coordinate system of the data returned by the Kinect has its origin at the Kinect sensor, so the camera in Unity3D is placed at the coordinate origin to simplify the calculation of virtual model coordinates during 3D registration. Adjust the Field of view and Clipping Planes...
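As a minimal sketch of this step (written in Python rather than Unity's native C#, using only the Kinect range quoted above), the camera settings could be derived as follows. The helper function is an illustrative assumption, not code from the patent; the returned keys are merely named after Unity's Camera properties (fieldOfView, nearClipPlane, farClipPlane, aspect):

import math

# Kinect effective detection range quoted in the description above.
KINECT_NEAR_M = 1.2       # nearest reliable depth, in meters
KINECT_FAR_M = 3.6        # farthest reliable depth, in meters
KINECT_H_FOV_DEG = 57.0   # horizontal field of view, in degrees
KINECT_V_FOV_DEG = 43.0   # vertical field of view, in degrees

def matching_unity_camera_settings():
    """Return Unity-style camera settings that match the Kinect view frustum.

    Unity's Camera.fieldOfView is the vertical FOV, so the 57-degree
    horizontal FOV is reproduced through the aspect ratio reported here.
    """
    aspect = math.tan(math.radians(KINECT_H_FOV_DEG) / 2.0) / \
             math.tan(math.radians(KINECT_V_FOV_DEG) / 2.0)
    return {
        "position": (0.0, 0.0, 0.0),      # camera sits at the Kinect/sensor origin
        "fieldOfView": KINECT_V_FOV_DEG,  # vertical FOV, in degrees
        "nearClipPlane": KINECT_NEAR_M,
        "farClipPlane": KINECT_FAR_M,
        "aspect": round(aspect, 3),
    }

print(matching_unity_camera_settings())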



Abstract

The invention relates to a multi-interaction method based on Kinect and Unity3D. The method comprises the following steps: S1: adjusting the camera parameters in Unity3D to be consistent with the effective detection range of the Kinect; S2: using the Kinect to determine the user coordinates and the ground equation; S3: determining the virtual model coordinates according to the relative position and registering the virtual model; S4: designing pose and voice interactions; S5: using Unity3D to control the model's displacement, animation, and multimedia effects; S6: fusing and displaying the image obtained by the Unity3D camera and the image obtained by the Kinect camera. The method uses the Kinect's support for voice recognition and human-skeleton positioning to add trigger modes for three-dimensional registration of the virtual model, provides users with more interaction modes through limb-motion recognition, improves the user experience, and uses the Unity3D three-dimensional engine to process the model pose automatically, greatly simplifying the steps required for three-dimensional registration. The invention further discloses a multi-interaction device based on Kinect and Unity3D.
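The six steps can be read as a single processing loop. The skeleton below is a hypothetical illustration of that flow only; the kinect and unity objects and every method called on them are placeholder names, not APIs from the patent or from any particular SDK:

def multi_interaction_loop(kinect, unity):
    """Hypothetical skeleton of steps S1-S6 from the abstract."""
    # S1: match the Unity camera to the Kinect's effective detection range
    unity.configure_camera(near=1.2, far=3.6, vertical_fov_deg=43.0)

    while unity.is_running():
        # S2: user (skeleton) coordinates and the ground-plane equation from the Kinect
        user_coords = kinect.get_skeleton_joints()
        ground_plane = kinect.get_ground_equation()

        # S3: determine the virtual model coordinates from the relative position and register the model
        model = unity.register_model(user_coords, ground_plane)

        # S4: recognize the designed poses and voice commands
        pose = kinect.recognize_pose(user_coords)
        command = kinect.recognize_speech()

        # S5: let Unity3D drive displacement, animation, and multimedia effects
        unity.apply_interaction(model, pose, command)

        # S6: fuse the Unity camera image with the Kinect camera image and display the result
        unity.display_fused(unity.render_frame(), kinect.get_color_frame())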

Description

Technical field

[0001] The invention relates to the technical field of computer augmented reality, and in particular to a multi-interaction method and device based on Kinect and Unity3D.

Background technique

[0002] Augmented reality (AR) technology was first proposed in the 1990s and has since been widely applied in fields such as medical care, education, industry, and commerce. A widely cited definition of augmented reality was given by Ronald Azuma of the University of North Carolina in 1997 and covers three main aspects: it combines the real and the virtual, it is interactive in real time, and it is registered in 3D. The technology superimposes a virtual scene on top of the real one on the screen and allows participants to interact with the virtual scene. At present, the implementation process of augmented reality is generally as follows: 1) obtain the scene image through an image acquisition device; 2) identify and track the calibration image or text in the scene, calcula...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F3/01
CPC: G06F3/017
Inventors: 王虓, 郭新宇, 吴升, 温维亮, 王传宇
Owner: BEIJING RES CENT OF INTELLIGENT EQUIP FOR AGRI