
Man-machine interaction integrated device based on Leap Motion equipment

A somatosensory-device-based human-computer interaction technology, applied in the field of human-computer interaction, which solves the problems of recognition distance and recognition accuracy, achieving a powerful, integrated system that makes up for the accuracy limitations of a single device.

Inactive Publication Date: 2017-05-31
GUANGZHOU MIDSTERO TECH CO LTD

AI Technical Summary

Problems solved by technology

[0006] In order to overcome the shortcomings of a single somatosensory device in recognition distance and recognition accuracy, the purpose of the present invention is to provide a human-computer interaction integrated device based on somatosensory equipment, one that overcomes the difficulties of long-distance somatosensory recognition and of recognition accuracy, realizes recognition of operations at long range and close range and of various gestures or tools, and greatly enhances the somatosensory enjoyment of human-computer interaction in 3D virtual reality.



Examples


Embodiment 1

[0066] As shown in Figure 5, the human-computer interaction integrated device based on somatosensory equipment described in the present invention is applied to a 3D keyboard, that is, to performing click operations on a 3D virtual image using the touch recognition area 21 and the touch prediction area 22. The specific implementation method is as follows:

[0067] (1) Start the camera 14 and the Kinect camera 11 to collect information, determine which device's recognition and tracking area the user is in, and perform human eye tracking and recognition;

[0068] (2) The display terminal 12 switches to the corresponding 3D playback mode;

[0069] (3) Start the Leap Motion somatosensory device 13 to collect gesture information; the calculation processing unit 152 performs color-space separation, filtering, segmentation and contour extraction on the collected images, that is, it calculates the positions of the user's fingers according to the relative positional relationship between the ... A hedged sketch of this image-processing step is given below.
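
The source does not spell out the image-processing details of step (3), so the following is only a minimal Python sketch of the operations it names, color-space separation, filtering, segmentation and contour extraction, done here with OpenCV. The skin-tone thresholds and the fingertip heuristic are illustrative assumptions, not values from the patent.

```python
# Minimal sketch of step (3): isolate a hand contour from a camera frame
# via color-space separation, filtering and segmentation (OpenCV).
# Threshold values are assumed, not taken from the patent.
import cv2
import numpy as np

def extract_hand_contour(frame_bgr: np.ndarray):
    """Return the largest skin-colored contour in a BGR frame, or None."""
    # Color-space separation: HSV tolerates lighting changes better than BGR.
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Segmentation: threshold a rough skin-tone range.
    mask = cv2.inRange(hsv, np.array([0, 30, 60]), np.array([20, 150, 255]))
    # Filtering: suppress speckle noise before extracting contours.
    mask = cv2.medianBlur(mask, 5)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea) if contours else None

def fingertip_point(contour: np.ndarray):
    """Crude fingertip estimate: the topmost point of the hand contour."""
    return tuple(contour[contour[:, :, 1].argmin()][0])
```

A click on the 3D virtual keyboard could then be reported when the estimated fingertip enters the touch recognition area 21, with the touch prediction area 22 used to anticipate the motion one step earlier; that mapping is likewise an assumption, since the patent text is truncated here.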

Embodiment 2

[0071] As shown in Figure 6, the human-computer interaction integrated device based on somatosensory equipment described in the present invention is applied to a 3D drawing board. The specific implementation method is as follows:

[0072] (1) Start the 2D mode of the display terminal 12;

[0073] (2) Start the Kinect camera 11 and the camera 14 to collect information, determine which device's human-eye recognition and tracking range the user is in, perform human eye tracking, and switch to the corresponding 3D playback mode;

[0074] (3) Start the Leap Motion somatosensory device 13; the calculation processing unit 152 performs color-space separation, filtering, segmentation and contour extraction on the collected images to obtain feature points, that is, according to the relative positional relationship between the human body and the detection device, it calculates the spatial distribution of the feature points of the pen-holding hand, combined with ... A hedged sketch of one way the tracked tip positions could drive the drawing board follows.
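
The patent does not show how the tracked pen tip becomes drawing input, so here is a hedged Python sketch of one way the per-frame tip positions from step (3) could be accumulated into a 3D stroke. `Stroke3D` and its jitter threshold are illustrative assumptions, not structures from the source.

```python
# Hedged sketch: accumulate per-frame pen-tip samples into a 3D stroke for
# the drawing board. The (x, y, z) tip samples are assumed to come from the
# tracking layer of step (3); names and thresholds are illustrative only.
import math
from dataclasses import dataclass, field

@dataclass
class Stroke3D:
    min_step_mm: float = 1.0                    # ignore sub-threshold jitter
    points: list = field(default_factory=list)  # accepted (x, y, z) samples

    def add(self, tip):
        """Append a tip sample unless it barely moved since the last one."""
        if self.points and math.dist(tip, self.points[-1]) < self.min_step_mm:
            return
        self.points.append(tip)

# Example: feed tip positions frame by frame, then hand the stroke to the
# display terminal's renderer.
stroke = Stroke3D()
for tip in [(0.0, 0.0, 0.0), (0.3, 0.0, 0.0), (2.0, 0.5, 0.1)]:
    stroke.add(tip)
print(stroke.points)  # [(0.0, 0.0, 0.0), (2.0, 0.5, 0.1)]
```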

Embodiment 3

[0078] As shown in Figure 7, the human-computer interaction integrated device based on somatosensory equipment described in the present invention is applied to 3D model gesture control. The specific implementation method is as follows:

[0079] (1) Start the 2D mode of the display terminal 12;

[0080] (2) Start the Kinect camera 11 and the camera 14 to collect information, determine which device's human-eye recognition and tracking range the user is in, perform human eye tracking, and switch to the corresponding 3D playback mode;

[0081] (3) Start the Leap Motion somatosensory device 13; the calculation processing unit 152 performs color-space separation, filtering, segmentation and contour extraction on the images collected in real time to obtain feature points, that is, according to the relative positional relationship between the human body and the detection device, it calculates the spatial distribution of the feature points of both hands and integrates it with the spatial d... A hedged sketch of one possible two-hand mapping follows.
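
The text truncates before describing how the two hands' feature points drive the 3D model, so the following Python sketch shows one plausible mapping: inter-hand distance to scale, midpoint displacement to translation, and hand-to-hand direction to yaw. The palm-center inputs and the mapping itself are assumptions, not the patent's method.

```python
# Hedged sketch: map two-hand motion between consecutive frames to 3D-model
# transform deltas. Palm centers are assumed to arrive from the somatosensory
# layer as (x, y, z) tuples; the mapping is illustrative, not the patent's.
import math

def two_hand_transform(left, right, prev_left, prev_right):
    """Return (scale, translate, yaw) deltas derived from two-hand motion."""
    # Pinch/spread: change in inter-hand distance drives uniform scaling.
    scale = math.dist(left, right) / max(math.dist(prev_left, prev_right), 1e-6)
    # Pan: displacement of the midpoint between the hands drives translation.
    mid = [(a + b) / 2 for a, b in zip(left, right)]
    prev_mid = [(a + b) / 2 for a, b in zip(prev_left, prev_right)]
    translate = [m - p for m, p in zip(mid, prev_mid)]
    # Twist: rotation of the hand-to-hand direction in the XZ plane drives yaw.
    def yaw(a, b):
        return math.atan2(b[2] - a[2], b[0] - a[0])
    rotate_y = yaw(left, right) - yaw(prev_left, prev_right)
    return scale, translate, rotate_y
```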


Abstract

The invention discloses a man-machine interaction integrated device based on Leap Motion equipment. The device comprises a Kinect camera, the Leap Motion equipment, a camera, a processing terminal and a display terminal. The Kinect camera acquires skeleton image information and depth image information of a user; the Leap Motion equipment acquires hand or tool image information of the user; the camera, positioned to one side of the Kinect camera, acquires image information located in the visual blind area of the Kinect camera; the processing terminal receives and processes the skeleton image information, the depth image information, the hand or tool image information and the image information from the visual blind area; and the display terminal displays an initial image and dynamically displays the image information processed by the processing terminal to realize man-machine interaction operation. The device can recognize long-distance and short-distance operation and various gestures or tools, greatly improving the somatosensory enjoyment of man-machine interaction in 3D (three-dimensional) virtual reality.
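
As a reading aid only, here is a hedged Python sketch of the data flow the abstract describes: four input streams fused by the processing terminal and pushed to the display terminal. All class and function names are illustrative; the patent defines no such structures.

```python
# Hedged sketch of the abstract's data flow; all names are illustrative.
from dataclasses import dataclass
from typing import Any, Iterable

@dataclass
class CaptureFrame:
    skeleton: Any = None      # Kinect camera: skeleton image information
    depth: Any = None         # Kinect camera: depth image information
    hand_or_tool: Any = None  # Leap Motion equipment: hand/tool images
    blind_area: Any = None    # side camera covering Kinect's visual blind area

def process(frame: CaptureFrame) -> Any:
    """Stand-in for the processing terminal's recognition/fusion step."""
    return frame

def interaction_loop(frames: Iterable[CaptureFrame], display) -> None:
    # The display terminal shows the initial image, then each processed frame.
    for frame in frames:
        display.show(process(frame))
```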

Description

Technical field

[0001] The invention belongs to the technical field of human-computer interaction, and in particular relates to a human-computer interaction integrated device based on somatosensory equipment.

Background technique

[0002] Virtual reality (VR) is a three-dimensional space generated by computer simulation that supplies simulated visual, auditory, tactile and other sensory input, allowing users to experience the virtual world as if they were actually there. The concept of virtual reality has attracted widespread attention since it was proposed in the 1950s, and as hardware and video technology have matured, the VR industry has begun its commercialization. Embedding recognition and tracking functions in a naked-eye 3D display gives people a brand-new human-computer interaction VR experience, with applications in games, social networking and other fields. In particular, the naked...


Application Information

IPC(8): G06F3/01, G06F3/0481
CPC: G06F3/013, G06F3/017, G06F3/04815, G06F2203/012
Inventors: 李焜阳, 范杭, 张瀚韬, 吴逸畅, 周延桂, 刘婷婷
Owner: GUANGZHOU MIDSTERO TECH CO LTD