
Medical model interaction visualization method and system based on gesture recognition

A gesture-recognition technology applied in the field of computer vision, addressing problems such as insufficiently convenient human-computer interaction

Pending Publication Date: 2020-09-08
GENERAL HOSPITAL OF PLA


Problems solved by technology

[0005] At present, however, the display and manipulation of medical 3D models still rely on the most basic keyboard-and-mouse control to realize human-computer interaction, which is not convenient enough.



Examples


Example 1

[0046] Referring to Figure 1, this embodiment provides a gesture-recognition-based method for interactive visualization of medical models, comprising the following steps:

[0047] S101, collecting images of the operator, and capturing visual images of the operator's hand movements;

[0048] It should be noted that in the above step, images of the operator are collected by a binocular camera; calibrated stereo images are obtained through stereo calibration, stereo matching is completed to obtain a disparity image, and a depth image is then computed from it so as to capture the visual image of the operator's hand movement. The hand-movement visual image includes a left visual image and a right visual image.
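The disparity-to-depth step described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: it assumes an already rectified stereo pair, a known focal length f (in pixels) and baseline B (in metres), and applies the standard relation Z = f·B/d per pixel.

```python
import numpy as np

def depth_from_disparity(disparity, focal_px, baseline_m):
    """Convert a disparity map (pixels) to a depth map (metres).

    Assumes a rectified stereo pair: for each pixel, depth Z = f * B / d,
    where f is the focal length in pixels and B is the camera baseline.
    Pixels with zero or negative (invalid) disparity get depth 0.
    """
    disparity = np.asarray(disparity, dtype=np.float64)
    depth = np.zeros_like(disparity)
    valid = disparity > 0
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth

# Example: 700 px focal length, 6 cm baseline, 30 px disparity -> 1.4 m
d = np.array([[30.0, 0.0], [42.0, 70.0]])
print(depth_from_disparity(d, focal_px=700.0, baseline_m=0.06))
```

Large disparities map to near depths and small disparities to far ones, which is why the hand (close to the camera) separates cleanly from the background in the depth image.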

[0049] S102, using a gesture segmentation algorithm to perform gesture segmentation on the captured hand movement visual image;

[0050] It should be no...
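The gesture segmentation algorithm itself is not specified in this excerpt. As a stand-in, the sketch below segments a hand candidate from an RGB frame using a classic fixed skin-colour rule; the thresholds are illustrative assumptions, and a real system would refine the mask with morphology and connected-component analysis.

```python
import numpy as np

def segment_hand(rgb):
    """Binary hand mask from an RGB image via a simple skin-colour rule.

    A stand-in for the (unspecified) gesture segmentation algorithm:
    a pixel counts as skin when R > 95, G > 40, B > 20, R - G > 15,
    R - B > 15, and max(R,G,B) - min(R,G,B) > 15.
    """
    rgb = np.asarray(rgb, dtype=np.int16)  # signed, so differences don't wrap
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    spread = rgb.max(axis=-1) - rgb.min(axis=-1)
    return (r > 95) & (g > 40) & (b > 20) & (r - g > 15) & (r - b > 15) & (spread > 15)

# A skin-like pixel vs. a grey background pixel
img = np.array([[[200, 140, 120], [90, 90, 90]]], dtype=np.uint8)
print(segment_hand(img))  # -> [[ True False]]
```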

Example 2

[0058] Referring to Figure 2, this embodiment provides a gesture-recognition-based method for interactive visualization of medical models, comprising the following steps:

[0059] S101, realize rendering of the controlled model: the two-dimensional image information stored in the computer can be displayed in three-dimensional form through operations such as dragging and moving the mouse, and region-selection operations can be completed in cooperation with the keyboard. The model rendering function is the main function of this system;

[0060] S102, collect the visual image of the operator's action through the binocular camera, obtain calibrated stereo images through stereo calibration, complete stereo matching to obtain the disparity image, and then, combining the camera's intrinsic and extrinsic parameters, using...
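The stereo-matching step that produces the disparity image can be illustrated with brute-force SAD block matching. This is a teaching sketch under strong assumptions (rectified greyscale inputs, small search range), not the patent's matcher; production systems typically use more robust methods such as semi-global matching.

```python
import numpy as np

def disparity_sad(left, right, max_disp=8, win=1):
    """Per-pixel disparity by brute-force SAD block matching.

    Assumes rectified greyscale images, so each left-image pixel's match
    lies on the same row of the right image, shifted left by the
    disparity. `win` is the half-width of the matching window.
    """
    left = np.asarray(left, dtype=np.float64)
    right = np.asarray(right, dtype=np.float64)
    h, w = left.shape
    pad = win
    lp = np.pad(left, pad, mode="edge")
    rp = np.pad(right, pad, mode="edge")
    disp = np.zeros((h, w), dtype=np.int64)
    for y in range(h):
        for x in range(w):
            block = lp[y:y + 2 * pad + 1, x:x + 2 * pad + 1]
            best_cost, best_d = np.inf, 0
            for d in range(min(max_disp, x) + 1):
                cand = rp[y:y + 2 * pad + 1, x - d:x - d + 2 * pad + 1]
                cost = np.abs(block - cand).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

Feeding the resulting disparity map through the depth relation Z = f·B/d (with the camera's intrinsic parameters) then yields the depth image used for hand capture.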

Example 3

[0067] This embodiment provides a gesture recognition-based interactive visualization system for medical models, which includes:

[0068] The binocular camera is used to collect images of the operator and capture visual images of their hand movements;

[0069] A gesture segmentation module, configured to use a preset gesture segmentation algorithm to perform gesture segmentation on the hand movement visual image captured by the binocular camera;

[0070] The gesture analysis and tracking module is used to analyze the change of the operator's gesture based on the gesture segmentation result of the hand movement visual image by the gesture segmentation module;

[0071] The gesture recognition and control module is used to control the operated medical model in real time according to the change of the operator's gesture.

[0072] The gesture recognition-based medical model interactive visualization system of this embodiment corresponds to the gesture recognition-based medical mod...
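The four modules above can be sketched as a minimal pipeline. All class names, the threshold segmenter, and the centroid-motion-to-command mapping below are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

class GestureSegmentation:
    """Stand-in segmentation module: thresholds a greyscale frame
    into a binary hand mask (threshold value is an assumption)."""
    def __init__(self, threshold=128):
        self.threshold = threshold

    def segment(self, frame):
        return np.asarray(frame) > self.threshold

class GestureTracker:
    """Analysis-and-tracking module: follows the hand-mask centroid
    between frames and reports its motion as a 2-vector."""
    def __init__(self):
        self.prev = None

    def update(self, mask):
        ys, xs = np.nonzero(mask)
        if xs.size == 0:
            return np.zeros(2)
        centroid = np.array([xs.mean(), ys.mean()])
        delta = np.zeros(2) if self.prev is None else centroid - self.prev
        self.prev = centroid
        return delta

class ModelController:
    """Recognition-and-control module with a hypothetical mapping:
    horizontal hand motion rotates the model, vertical motion translates it."""
    def __init__(self):
        self.rotation = 0.0
        self.translation = 0.0

    def apply(self, delta):
        self.rotation += float(delta[0])
        self.translation += float(delta[1])

def pipeline_step(frame, seg, tracker, ctrl):
    """Run one camera frame through segment -> track -> control."""
    ctrl.apply(tracker.update(seg.segment(frame)))
```

In use, each incoming frame from the binocular pipeline would be pushed through `pipeline_step`, so the controlled model updates in real time as the hand moves.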



Abstract

The invention provides a medical model interactive visualization method and system based on gesture recognition. The method comprises the steps of: collecting images of an operator and capturing a visual image of the operator's hand motion; performing gesture segmentation on the captured hand-motion visual image with a preset gesture segmentation algorithm; analyzing changes in the operator's gesture based on the gesture segmentation result; and controlling the operated medical model in real time according to those changes. By combining three-dimensional reconstruction with gesture recognition, the method and system extend image processing and human-computer interaction technology into the medical field, and can be used to display a medical three-dimensional model under gesture control.

Description

Technical Field

[0001] The present invention relates to the technical field of computer vision, and in particular to a method and system for interactive visualization of medical models based on gesture recognition.

Background

[0002] As one of today's important technologies, human-computer interaction has attracted much attention. Both the national 973 Plan and the Medium- and Long-Term Scientific and Technological Development Outline propose supporting "harmonious human-computer interaction theory and basic research on intelligent information processing", and give priority to the development of "virtual reality technology" and "intelligent perception technology". Mechanical input devices such as the keyboard and mouse are ill-suited to three-dimensional, high-degree-of-freedom expression, so this interaction method is to some extent inconvenient. With the deepening of research work, more and more people have begun to devote themselves to ...


Application Information

IPC (IPC8): G06K9/00; G06K9/34; G06F3/01; G06T17/00
CPC: G06F3/017; G06F3/011; G06T17/00; G06V40/20; G06V10/267
Inventor 鲁媛媛赵静何昆仑李俊来肖若秀汪洋常秀娟周超张伟肖伟厚
Owner GENERAL HOSPITAL OF PLA