
Automated steering systems and methods for a robotic endoscope

A technology relating to endoscopes and robotics, applied in the field of automated steering systems and methods for robotic endoscopes

Active Publication Date: 2018-10-23
BIO MEDICAL ENG (HK) LTD

AI Technical Summary

Problems solved by technology

However, to date, no single method has been sufficiently reliable to augment or replace the know-how provided by skilled surgeons.



Examples


Embodiment

[0199] Embodiment - Soft Robotic Endoscope Including Automated Steering Control System

[0200] Figure 11 is a block diagram illustrating a robotic colonoscope that includes the automated steering control system of the present disclosure. As shown in the figure, the robotic colonoscopy system includes a guide portion 1101 (i.e., a steerable bending section) to which a plurality of sensors can be affixed. In this particular embodiment, the plurality of sensors includes imaging sensors (e.g., cameras) and touch (or distance) sensors. The movement of the steerable bending section is controlled by the actuation unit 1103. In this embodiment, the degrees of motion controlled by the actuation unit include at least rotational motion about the pitch axis and the yaw axis. In some embodiments, the degrees of motion controlled by the actuation unit may also include, for example, rotational motion about the roll axis or translational movement (e.g., forward or backward). In some case...
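The following is a minimal, hypothetical Python sketch of the components named above: a guide portion (1101) carrying imaging and touch/distance sensors, and an actuation unit (1103) that drives at least pitch and yaw, with roll and forward/backward translation as optional degrees of motion. The class names, units, and command interface are illustrative assumptions, not the patent's implementation.

# Hypothetical sketch of the Figure 11 block diagram; names and units are assumed.
from dataclasses import dataclass, field
from enum import Enum, auto

class Axis(Enum):
    PITCH = auto()
    YAW = auto()
    ROLL = auto()         # optional degree of motion
    TRANSLATION = auto()  # optional forward/backward movement

@dataclass
class GuidePortion:
    """Steerable bending section with its affixed sensors (element 1101)."""
    imaging_sensors: list = field(default_factory=lambda: ["camera"])
    touch_sensors: list = field(default_factory=lambda: ["touch/distance"])

@dataclass
class ActuationUnit:
    """Drives the bending section's degrees of motion (element 1103)."""
    axes: tuple = (Axis.PITCH, Axis.YAW)

    def command(self, **angles):
        # e.g. command(PITCH=0.1, YAW=-0.2); only supported axes are actuated.
        return {Axis[name]: value for name, value in angles.items()
                if Axis[name] in self.axes}

guide = GuidePortion()
actuator = ActuationUnit(axes=(Axis.PITCH, Axis.YAW, Axis.ROLL))
print(actuator.command(PITCH=0.1, YAW=-0.2, ROLL=0.0))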

Embodiment 1

[0244] Embodiment 1. A control system for providing an adaptive steering control output signal for steering a robotic endoscope, the control system comprising: a) a first image sensor configured to capture a first input data stream comprising a series of two or more images of a lumen; and b) one or more processors that are individually or collectively configured to generate a steering control output signal based on an analysis of data derived from the first input data stream using a machine learning architecture, wherein the steering control output signal adapts in real time to changes in the data derived from the first input data stream.
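As a concrete illustration of Embodiment 1, the sketch below feeds a single image stream through a small convolutional network whose (pitch, yaw) output is recomputed for every incoming frame, so the steering command adapts as the data changes. The network shape, frame size, output range, and use of PyTorch are assumptions for illustration only; the patent specifies just "a machine learning architecture".

# Minimal sketch (assumptions as noted above), not the patent's implementation.
import torch
import torch.nn as nn

class SteeringNet(nn.Module):
    """Maps one RGB frame to a (pitch, yaw) steering command in [-1, 1]."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Sequential(nn.Flatten(), nn.Linear(32, 2), nn.Tanh())

    def forward(self, frame):
        return self.head(self.features(frame))

def steering_loop(frame_source, model, actuate):
    """Generate a fresh steering command for every frame in the input stream."""
    model.eval()
    with torch.no_grad():
        for frame in frame_source:                  # first input data stream
            command = model(frame.unsqueeze(0))[0]  # adapts to the latest frame
            actuate(pitch=command[0].item(), yaw=command[1].item())

# Example with a dummy 10-frame stream of 128x128 RGB images.
if __name__ == "__main__":
    frames = (torch.rand(3, 128, 128) for _ in range(10))
    steering_loop(frames, SteeringNet(), lambda pitch, yaw: print(pitch, yaw))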

Embodiment 2

[0245] Embodiment 2. The control system of embodiment 1, further comprising at least a second image sensor configured to capture at least a second input data stream comprising a series of two or more images of the lumen, wherein the steering control output signal is generated based on an analysis of data derived from the first input data stream and the at least second input data stream using a machine learning architecture, and wherein the steering control output signal adapts in real time to changes in data derived from said first input data stream or said at least second input data stream.
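Extending the previous sketch to Embodiment 2, the hypothetical two-stream variant below encodes frames from a first and a second image sensor separately and concatenates the features before the steering head, so the output reflects changes in either input data stream. The fusion-by-concatenation design and layer sizes are again illustrative assumptions, not the patent's stated architecture.

# Hypothetical two-stream variant of the previous sketch.
import torch
import torch.nn as nn

def encoder():
    # Same architecture for each stream, but independently weighted.
    return nn.Sequential(
        nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
        nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    )

class TwoStreamSteeringNet(nn.Module):
    """Fuses frames from two image sensors into one (pitch, yaw) command."""
    def __init__(self):
        super().__init__()
        self.enc1, self.enc2 = encoder(), encoder()
        self.head = nn.Sequential(nn.Linear(64, 2), nn.Tanh())

    def forward(self, frame1, frame2):
        fused = torch.cat([self.enc1(frame1), self.enc2(frame2)], dim=1)
        return self.head(fused)

# Dummy frames standing in for the first and second input data streams.
f1, f2 = torch.rand(1, 3, 128, 128), torch.rand(1, 3, 128, 128)
print(TwoStreamSteeringNet()(f1, f2))  # one adaptive steering command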



Abstract

Systems and methods for automated steering control of a robotic endoscope, e.g., a colonoscope, are provided. The control system may comprise: a) a first image sensor configured to capture a first input data stream comprising a series of two or more images of a lumen; and b) one or more processors that are individually or collectively configured to generate a steering control output signal based on an analysis of data derived from the first input data stream using a machine learning architecture, wherein the steering control output signal adapts to changes in the data of the first input data stream in real time.

Description

[0001] Cross Reference

[0002] This application claims the benefit of U.S. Provisional Application No. 62/484,754, filed April 12, 2017, U.S. Provisional Application No. 62/535,386, filed July 21, 2017, U.S. Application No. 15/928,843, filed March 22, 2018, and PCT Application No. PCT/CN2018/080034, filed March 22, 2018, each of which is hereby incorporated by reference.

Background

[0003] Routine colonoscopy has been used for colorectal cancer screening. However, because the colon is a tortuous, flexible tube with many internal ridges and sharp bends, advancement of the colonoscope is difficult and is often complicated by excessive entanglement and stretching of the colon. This can cause significant pain and discomfort, as well as a significant risk of over-dilation or even perforation of the colon. Thus, skilled manipulation of a colonoscope requires a high degree of technical expertise from the physician.

[0004] Robotic technology has the advantage that it can be incorporated in...


Application Information

IPC(8): A61B1/31; A61B34/30; A61B34/10; G06N3/04; G06V10/44; G06V10/764
CPC: A61B34/10; A61B34/30; A61B1/00006; A61B1/00009; A61B1/31; A61B2034/101; A61B2034/303; A61B2034/301; G06N3/045; G06T2207/30172; A61B1/0016; A61B5/065; A61B2576/00; A61B34/32; A61B2090/065; G06T7/64; G06T7/13; G06T7/174; G06N3/08; G06T2207/10068; G06T2207/20081; G06T2207/20084; G06T2207/30028; G05B2219/45118; G16H30/40; G06V10/44; G06V2201/03; G06V10/82; A61B1/000096; A61B1/00097; G06V10/764; G06N3/044
Inventor: 杨重光, 朱棣文, 陈文岭, 彭天泽
Owner: BIO MEDICAL ENG (HK) LTD