4 results for Robotic Navigation
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Image-guided microsurgery requires accuracies an order of magnitude higher than today's navigation systems provide. A critical step toward meeting such low-error requirements is a highly accurate and verified patient-to-image registration. With the aim of reducing target registration error to a level that would facilitate the use of image-guided robotic microsurgery on the rigid anatomy of the head, we have developed a semiautomatic fiducial detection technique. Automatic force-controlled localization of fiducials on the patient is achieved through a robot-controlled tactile search within the head of a standard surgical screw. Precise detection of the corresponding fiducials in the image data is realized using an automated model-based matching algorithm on high-resolution, isometric cone beam CT images. Verification of the registration technique on phantoms demonstrated that, through the elimination of user variability, clinically relevant target registration errors of approximately 0.1 mm could be achieved.
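The abstract does not give the registration math. As background, point-based patient-to-image registration of matched fiducial sets is commonly solved as a least-squares rigid alignment via SVD (Arun's method). The sketch below is illustrative only, not the authors' implementation; all function names and the example coordinates are assumptions.

```python
import numpy as np

def rigid_register(patient_pts, image_pts):
    """Least-squares rigid transform mapping patient_pts onto image_pts.

    Both inputs are (N, 3) arrays of matched fiducial positions.
    Returns rotation R (3x3) and translation t (3,) such that
    image ~= R @ patient + t. Uses the SVD-based closed form.
    """
    p_mean = patient_pts.mean(axis=0)
    q_mean = image_pts.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (patient_pts - p_mean).T @ (image_pts - q_mean)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) solution
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q_mean - R @ p_mean
    return R, t

def fiducial_registration_error(patient_pts, image_pts, R, t):
    """RMS residual over the fiducials after applying the transform."""
    residuals = image_pts - (patient_pts @ R.T + t)
    return float(np.sqrt((residuals ** 2).sum(axis=1).mean()))
```

Note that a low fiducial registration error does not guarantee a low target registration error at the surgical target, which is precisely the caveat raised in the third abstract below.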
Abstract:
Computer-aided microscopic surgery of the lateral skull base is a rare intervention in daily practice. It is often a delicate and difficult minimally invasive intervention, since orientation between the petrous bone and the petrous bone apex is often challenging. In the case of aural atresia or tumors, the normal anatomical landmarks are often absent, making orientation more difficult. Navigation support, together with imaging techniques such as CT, MR and angiography, enables the surgeon in such cases to perform the operation more accurately and, in some cases, in a shorter time. However, there are no internationally standardised indications for navigated surgery on the lateral skull base. Miniaturised robotic systems are still in the initial validation phase.
Abstract:
The application of image-guided systems with or without support by surgical robots relies on the accuracy of the navigation process, including patient-to-image registration. The surgeon must carry out the procedure based on the information provided by the navigation system, usually without being able to verify its correctness beyond visual inspection. Misleading surrogate parameters such as the fiducial registration error are often used to describe the success of the registration process, while a lack of methods describing the effects of navigation errors, such as those caused by tracking or calibration, may prevent the application of image guidance in certain accuracy-critical interventions. During minimally invasive mastoidectomy for cochlear implantation, a direct tunnel is drilled from the outside of the mastoid to a target on the cochlea based on registration using landmarks solely on the surface of the skull. Using this methodology, it is impossible to verify that the drill is advancing in the correct direction and that injury to the facial nerve will be avoided. To overcome this problem, a tool localization method based on drilling process information is proposed. The algorithm estimates the pose of a robot-guided surgical tool during a drilling task based on the correlation of the observed axial drilling force and the heterogeneous bone density in the mastoid extracted from 3-D image data. We present here one possible implementation of this method tested on ten tunnels drilled into three human cadaver specimens, where an average tool localization accuracy of 0.29 mm was observed.
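The correlation step can be illustrated with a minimal, hypothetical sketch: given the axial force recorded along the drilled depth and bone-density profiles sampled from the CT volume along several candidate tool poses, the pose whose density profile best correlates with the observed force is selected as the estimate. The function names, the use of normalized correlation as the similarity measure, and the synthetic profiles are all assumptions, not the paper's actual pipeline.

```python
import numpy as np

def normalized_correlation(a, b):
    """Pearson-style correlation of two equal-length 1-D profiles."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float(np.mean(a * b))

def estimate_pose(force_profile, candidate_density_profiles):
    """Pick the candidate pose whose CT density profile best matches
    the observed axial drilling force along the tunnel depth.

    Returns (best_index, scores), where scores[i] is the correlation
    of the force profile with candidate i's density profile.
    """
    scores = [normalized_correlation(force_profile, d)
              for d in candidate_density_profiles]
    return int(np.argmax(scores)), scores
```

The idea is that drilling force rises and falls with local bone density, so the force trace acts as a fingerprint of the trajectory actually being drilled.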
Abstract:
BACKGROUND Stereotactic navigation technology can enhance guidance during surgery and enable the precise reproduction of planned surgical strategies. Currently, specific systems (such as the CAS-One system) are available for instrument guidance in open liver surgery. This study aims to evaluate the implementation of such a system for the targeting of hepatic tumors during robotic liver surgery. MATERIAL AND METHODS Optical tracking references were attached to one of the robotic instruments and to the robotic endoscopic camera. After instrument and video calibration and patient-to-image registration, a virtual model of the tracked instrument and the available three-dimensional images of the liver were displayed directly within the robotic console, superimposed onto the endoscopic video image. An additional superimposed targeting viewer allowed for the visualization of the target tumor relative to the tip of the instrument, so that the distance between the tumor and the tool could be assessed and safe resection margins achieved. RESULTS Two cirrhotic patients underwent robotic navigated atypical hepatic resections for hepatocellular carcinoma. The augmented endoscopic view allowed for the definition of an accurate resection margin around the tumor. The overlay of reconstructed three-dimensional models was also used during parenchymal transection for the identification of vascular and biliary structures. Operative times were 240 min in the first case and 300 min in the second. There were no intraoperative complications. CONCLUSIONS The da Vinci Surgical System provided an excellent platform for image-guided liver surgery with stable optics and instrumentation. Robotic image guidance might improve the surgeon's orientation during the operation and increase accuracy in tumor resection. Further developments of this technological combination are needed to deal with organ deformation during surgery.
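As a toy illustration of the margin check a targeting viewer performs (not the CAS-One implementation), the clearance from the tracked instrument tip to the nearest point of a segmented tumor surface can be computed and compared against a planned resection margin. All names and the geometry below are assumptions for illustration.

```python
import numpy as np

def margin_to_tumor(tool_tip, tumor_surface_pts):
    """Minimum Euclidean distance (e.g. in mm) from the tracked
    instrument tip to a tumor surface given as an (N, 3) point cloud."""
    dists = np.linalg.norm(tumor_surface_pts - tool_tip, axis=1)
    return float(dists.min())

def margin_ok(tool_tip, tumor_surface_pts, planned_margin_mm):
    """True if the tip is at least the planned margin away from the tumor."""
    return margin_to_tumor(tool_tip, tumor_surface_pts) >= planned_margin_mm
```

In a real system the tumor surface would come from the registered 3-D liver model and the tip position from the optical tracker; a brute-force point-cloud scan like this would typically be replaced by a spatial index (e.g. a k-d tree) for dense meshes.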