41 results for Virtual and Augmented Reality
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
BACKGROUND: In this paper we present a landmark-based augmented reality (AR) endoscope system for endoscopic paranasal and transnasal surgeries, along with fast and automatic calibration and registration procedures for the endoscope. METHODS: Preoperatively, the surgeon selects natural landmarks or defines new landmarks in the CT volume. After registration of the preoperative CT to the patient, these landmarks are overlaid on the endoscopic video stream. The name of each landmark, along with its selected colour and its distance from the endoscope tip, is also augmented. The endoscope optics are calibrated and registered by fast and automatic methods. Accuracy of the system is evaluated in metallic-grid and cadaver set-ups. RESULTS: The root mean square (RMS) error of the system was 0.8 mm in a controlled laboratory set-up (metallic grid) and 2.25 mm in cadaver studies. CONCLUSIONS: A novel landmark-based AR endoscope system is implemented and its accuracy evaluated. Augmented landmarks help the surgeon orientate and navigate within the surgical field. The studies demonstrate the capability of the system for the proposed application. Further clinical studies are planned in the near future.
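The registration of the preoperative CT to the patient and the RMS error evaluation described in this abstract typically rest on paired-point rigid registration. The following is a minimal illustrative sketch (not the authors' implementation; function names and data are assumed) using the Kabsch algorithm:

```python
import numpy as np

def rigid_register(src, dst):
    """Kabsch algorithm: find rotation R and translation t that best map
    paired points src onto dst in a least-squares sense."""
    src_c = src - src.mean(axis=0)          # centre both point clouds
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

def rms_error(src, dst, R, t):
    """Root mean square residual of the registration, e.g. in mm."""
    residuals = (R @ src.T).T + t - dst
    return np.sqrt(np.mean(np.sum(residuals ** 2, axis=1)))
```

With noiseless corresponding points the RMS error is essentially zero; in a real metallic-grid or cadaver set-up, localization noise produces residuals of the order reported above.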
Abstract:
Intraoperative laparoscopic calibration remains a challenging task. In this work we present a new method and instrumentation for intraoperative camera calibration. In contrast to conventional calibration methods, the proposed technique allows intraoperative laparoscope calibration from single-perspective observations, resulting in a standardized scheme for calibration in a clinical scenario. Results show an average displacement error of 0.52 ± 0.19 mm, indicating sufficient accuracy for clinical use. Additionally, the proposed method was validated clinically by performing a calibration during surgery.
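The displacement error reported here is the kind of quantity obtained by projecting known 3-D points through the calibrated camera model and measuring their offset from the observed positions. A minimal pinhole-projection sketch (illustrative only; the intrinsics and point values are assumptions, not the paper's data):

```python
import numpy as np

def project(K, R, t, pts3d):
    """Pinhole projection: map 3-D world points into 2-D image coordinates
    via x ~ K (R X + t), then dehomogenize."""
    cam = (R @ pts3d.T).T + t        # world frame -> camera frame
    uv = (K @ cam.T).T               # apply intrinsics
    return uv[:, :2] / uv[:, 2:3]    # divide by depth

def mean_displacement_error(observed, projected):
    """Average Euclidean distance between matched 2-D points."""
    return np.mean(np.linalg.norm(observed - projected, axis=1))
```

Evaluating `mean_displacement_error` over calibration-target points gives a single scalar comparable to the 0.52 mm figure quoted above (after converting pixel offsets to metric units at the target's depth).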
Abstract:
Image-guided, computer-assisted neurosurgery has emerged to improve localization and targeting, to provide a better anatomic definition of the surgical field, and to decrease invasiveness. Usually, in image-guided surgery, a computer displays the surgical field in a CT/MR environment, using axial, coronal or sagittal views, or even a 3D representation of the patient. Such a system forces the surgeon to look away from the surgical scene to the computer screen. Moreover, because this information comes from preoperative imaging, it cannot be updated during the operation, so it remains valid for guidance mainly in the first stage of the surgical procedure, and chiefly for rigid structures such as bone. To address these two constraints, we are developing an ultrasound-guided surgical microscope. Such a system takes advantage of the fact that surgical microscopes and ultrasound systems are already used in neurosurgery, so it adds no further complexity to the surgical procedure. We have integrated an optical tracking device into the microscope together with an augmented reality overlay system, avoiding the need to look away from the scene and providing correctly aligned surgical images with sub-millimeter accuracy. In addition to the standard CT and 3D views, we are able to track an ultrasound probe; using a prior calibration and registration of the imaging, the acquired image is correctly projected into the overlay system, so the surgeon can always localize the target and verify the effects of the intervention. Several tests of the system have already been performed to evaluate its accuracy, and clinical experiments are currently in progress to validate the clinical usefulness of the system.
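Projecting a tracked ultrasound image into a microscope overlay of this kind amounts to chaining homogeneous transforms: the probe-to-image calibration followed by the tracker-reported probe pose. A minimal sketch with hypothetical, purely illustrative transform values (not the system's actual calibration):

```python
import numpy as np

def hom(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Illustrative calibrated chain (values are assumptions for the sketch):
T_world_probe = hom(np.eye(3), [10.0, 0.0, 0.0])  # tracker pose of the US probe
T_probe_image = hom(np.eye(3), [0.0, 0.0, 2.0])   # probe-to-image calibration

def us_pixel_to_world(u, v, scale_mm_per_px):
    """Map an ultrasound pixel (u, v) into tracker/world coordinates by
    composing the calibration and tracking transforms."""
    p_image = np.array([u * scale_mm_per_px, v * scale_mm_per_px, 0.0, 1.0])
    return (T_world_probe @ T_probe_image @ p_image)[:3]
```

In a live system, `T_world_probe` is updated every tracker frame while `T_probe_image` stays fixed after calibration, so each ultrasound pixel can be re-projected into the overlay in real time.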
Abstract:
PURPOSE: The aim of this study is to implement augmented reality in real-time image-guided interstitial brachytherapy to allow an intuitive real-time intraoperative orientation. METHODS AND MATERIALS: The developed system consists of a common video projector, two high-resolution charge-coupled device cameras, and an off-the-shelf notebook. The projector was used as a scanning device by projecting coded-light patterns to register the patient and superimpose the operating field with planning data and additional information in arbitrary colors. Subsequent movements of the nonfixed patient were detected by stereoscopically tracking passive markers attached to the patient. RESULTS: In a first clinical study, we evaluated the whole process chain from image acquisition to data projection and determined overall accuracy with 10 patients undergoing implantation. The described method enabled the surgeon to visualize planning data on top of any preoperatively segmented and triangulated surface (skin) with direct line of sight during the operation. Furthermore, the tracking system allowed dynamic adjustment of the data to the patient's current position and therefore eliminated the need for rigid fixation. Because of soft-part displacement, we obtained an average deviation of 1.1 mm when moving the patient, whereas changing the projector's position resulted in an average deviation of 0.9 mm. Mean deviation of all needles of an implant was 1.4 mm (range, 0.3-2.7 mm). CONCLUSIONS: The developed low-cost augmented-reality system proved to be accurate and feasible in interstitial brachytherapy. The system meets clinical demands and enables intuitive real-time intraoperative orientation and monitoring of needle implantation.
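Stereoscopic tracking of passive markers with two cameras, as described above, recovers each marker's 3-D position by triangulation from the two image observations. A minimal linear (DLT) triangulation sketch (illustrative only; the camera matrices below are assumed toy values, not the system's calibration):

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation: recover a 3-D point from its projections
    uv1, uv2 in two cameras with 3x4 projection matrices P1, P2."""
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],   # u1 * (P1_row3 . X) = P1_row1 . X
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)   # null-space solution in homogeneous coords
    X = Vt[-1]
    return X[:3] / X[3]
```

Triangulating each passive marker per frame, then rigidly registering the marker set to its planned pose, is what lets such a system compensate for movements of a nonfixed patient.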
Abstract:
During endoscopic surgery, it is difficult to ascertain anatomical landmarks once the anatomy has been manipulated or when the operating area is filled with blood. An augmented reality system can enhance the endoscopic view and further enable surgeons to view hidden critical structures or the results of preoperative planning.