Spinal minimally invasive surgery navigation method and system based on augmented reality

A minimally invasive surgery and augmented reality technology, applied in the field of surgical navigation. It addresses the problems that the real surgical scene is not visible, that doctors suffer visual fatigue, and that the difficulty of surgery is increased, so as to avoid surgical errors, enhance comfort, and improve the surgery success rate.

Status: Inactive. Publication Date: 2019-06-25
SUZHOU UNIV


Problems solved by technology

[0003] At present, a doctor wearing an HMD can observe arbitrary anatomical structures superimposed on the patient's body in real time. However, during the operation only the scene captured by the camera can be seen, and the real surgical scene cannot be seen at all, which greatly increases the risk of surgery.
Moreover, most such operations are relatively complicated and lengthy; wearing a head-mounted display for a long time causes the doctor discomfort.


Image

  • Spinal minimally invasive surgery navigation method and system based on augmented reality

Examples


Example Embodiment

[0040] Example one

[0041] As shown in Figure 1, the augmented reality-based minimally invasive spinal surgery navigation method in the first embodiment of the present invention includes the following steps:

[0042] S1. Reconstruct a virtual three-dimensional image of the patient's spine.

[0043] Specifically, it includes: reconstructing a virtual three-dimensional image of the patient's spine based on a set of CT images.
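The patent does not specify how this reconstruction is implemented; the following is a minimal sketch, assuming the CT series is available as DICOM files and that a bone iso-surface extracted by marching cubes serves as the virtual three-dimensional model. The SimpleITK and scikit-image libraries, the function name reconstruct_spine_surface, and the 250 HU threshold are illustrative assumptions, not part of the patent.

import SimpleITK as sitk
from skimage import measure

def reconstruct_spine_surface(dicom_dir, bone_threshold_hu=250):
    """Load a CT series and extract a bone iso-surface as (vertices, faces)."""
    reader = sitk.ImageSeriesReader()
    reader.SetFileNames(reader.GetGDCMSeriesFileNames(dicom_dir))
    volume = reader.Execute()                       # 3D CT volume in Hounsfield units
    array = sitk.GetArrayFromImage(volume)          # voxel array in (z, y, x) order
    sx, sy, sz = volume.GetSpacing()                # voxel size in mm, (x, y, z) order
    # Marching cubes on a bone-level threshold yields the virtual 3D spine surface.
    verts, faces, _, _ = measure.marching_cubes(
        array, level=bone_threshold_hu, spacing=(sz, sy, sx))
    return verts, faces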

[0044] S2. The virtual three-dimensional image is registered with the patient space to obtain the position of the virtual lesion point in the virtual three-dimensional image in the patient space.

[0045] The registration of the virtual three-dimensional image with the patient space specifically includes: setting four non-coplanar landmark points in the virtual three-dimensional image and in the patient space respectively, and aligning the four landmark points in the virtual three-dimensional image with the corresponding four landmark points in the patient space. ...
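One common way to realize such a four-landmark registration is a rigid point-based fit (the Kabsch/Horn method); the sketch below illustrates that idea under stated assumptions and is not the patent's stated algorithm. The function name register_landmarks and the use of plain NumPy are hypothetical.

import numpy as np

def register_landmarks(virtual_pts, patient_pts):
    """virtual_pts, patient_pts: (4, 3) arrays of corresponding, non-coplanar landmarks.
    Returns rotation R (3x3) and translation t (3,) with patient ~= R @ virtual + t."""
    v_c = virtual_pts.mean(axis=0)
    p_c = patient_pts.mean(axis=0)
    H = (virtual_pts - v_c).T @ (patient_pts - p_c)   # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))            # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = p_c - R @ v_c
    return R, t

# The virtual lesion point can then be mapped into patient space:
#   lesion_patient = R @ lesion_virtual + t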

Example Embodiment

[0051] Example two

[0052] A spine minimally invasive surgery navigation system based on augmented reality, which includes:

[0053] Image reconstruction module, used to reconstruct a virtual three-dimensional image of the patient's spine;

[0054] Specifically, it includes: reconstructing a virtual three-dimensional image of the patient's spine based on a set of CT images.

[0055] The first registration module is used to register the virtual three-dimensional image with the patient space to obtain the position of the virtual lesion point in the virtual three-dimensional image in the patient space;

[0056] The registration of the virtual three-dimensional image with the patient space specifically includes: setting four non-coplanar landmark points in the virtual three-dimensional image and in the patient space respectively, and aligning the four landmark points in the virtual three-dimensional image with the corresponding four landmark points in the patient space so that the two sets of landmark points coincide...



Abstract

The invention discloses a spinal minimally invasive surgery navigation method and system based on augmented reality. The method comprises the following steps: reconstructing a virtual three-dimensional image of the patient's spine; registering the virtual three-dimensional image with the patient space to obtain the position of a virtual lesion point in the virtual three-dimensional image; projecting a surgical path defined in the virtual three-dimensional image into the patient space; generating a DRR image from the preoperative CT images and registering it in real time with the intraoperative X-ray image to determine the actual lesion point; controlling a robot to clamp a surgical instrument and operate on the actual lesion point; and obtaining the real surgical scene in real time during the operation and outputting the acquired video signal on a 3D display. With this method and system, preoperative surgical path planning can accurately locate the lesion point, and the operation can be monitored in real time. Real-time tracking during the operation avoids surgical errors and improves the surgery success rate.
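The DRR-to-X-ray registration step is only described at a high level in the abstract. As a rough illustration under simplifying assumptions, a DRR can be produced by integrating the CT volume along the viewing axis and compared with the intraoperative X-ray using a normalized cross-correlation score; the parallel projection, the NCC metric, and the function names below are illustrative choices, not the patent's implementation.

import numpy as np

def simple_drr(ct_volume, axis=0):
    """Parallel-projection DRR: integrate CT intensities along one axis and normalize."""
    drr = ct_volume.sum(axis=axis).astype(np.float64)
    return (drr - drr.min()) / (drr.max() - drr.min() + 1e-9)

def ncc(drr, xray):
    """Normalized cross-correlation used as a 2D/3D similarity score; the candidate
    pose whose DRR maximizes this score is taken as the match to the X-ray image."""
    a = drr - drr.mean()
    b = xray - xray.mean()
    return float((a * b).sum() / (np.sqrt((a ** 2).sum() * (b ** 2).sum()) + 1e-9))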

Description

Technical field

[0001] The invention relates to the technical field of surgical navigation, and in particular to a navigation method and system for minimally invasive spinal surgery based on augmented reality.

Background technique

[0002] Image-guided surgery (IGS), also called surgical navigation, refers to the use of medical imaging equipment and computer image-processing methods: before surgery, the patient's multi-modal image data are reconstructed in three dimensions and visualized to obtain a three-dimensional model, on which a reasonable and quantitative surgical plan is formulated and a preoperative simulation is carried out; through intraoperative registration, the three-dimensional model, the actual position of the patient's body, and the real-time spatial position of the surgical instrument are unified in one coordinate system, and the position of the surgical instrument in space is acquired and displayed in real time by a three-dimensional positioning system. The doctor ...


Application Information

IPC(8): A61B34/10, A61B34/20, A61B34/00, A61B34/30, A61B90/00
Inventor: 张峰峰, 陈龙, 孙立宁
Owner: SUZHOU UNIV