32 results for Realtà Aumentata Augmented Reality App Vuforia Image Targeting Unity XCode iOS

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

100.00%

Publisher:

Abstract:

Image-guided, computer-assisted neurosurgery has emerged to improve localization and targeting, to provide a better anatomic definition of the surgical field, and to decrease invasiveness. Usually, in image-guided surgery, a computer displays the surgical field in a CT/MR environment, using axial, coronal or sagittal views, or even a 3D representation of the patient. Such a system forces the surgeon to look away from the surgical scene to the computer screen. Moreover, because this information is based on pre-operative imaging, it cannot be modified during the operation, so it remains valid for guidance in the first stage of the surgical procedure, and mainly for rigid structures such as bones. To address these two constraints, we are developing an ultrasound-guided surgical microscope. Such a system takes advantage of the fact that surgical microscopes and ultrasound systems are already used in neurosurgery, so it does not add complexity to the surgical procedure. We have integrated an optical tracking device into the microscope together with an augmented reality overlay system, with which we avoid the need to look away from the scene, providing correctly aligned surgical images with sub-millimeter accuracy. In addition to the standard CT and 3D views, we are able to track an ultrasound probe; using a prior calibration and registration of the imaging, the acquired image is correctly projected onto the overlay system, so the surgeon can always localize the target and verify the effects of the intervention. Several tests of the system have already been performed to evaluate its accuracy, and clinical experiments are currently in progress to validate the clinical usefulness of the system.
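
As an illustration of the transform chain such an overlay relies on, the sketch below composes hypothetical tracker-reported poses and a probe-to-image calibration to map ultrasound pixels into the microscope view; all matrices, intrinsics and point coordinates are invented placeholders, not the authors' actual calibration.

# Minimal sketch (not the authors' implementation): composing tracked poses to
# project ultrasound pixels into the microscope overlay. All values are placeholders.
import numpy as np

def to_homogeneous(points):
    """Append a row of ones so 3-D points can be multiplied by 4x4 transforms."""
    return np.vstack([points, np.ones((1, points.shape[1]))])

# Hypothetical rigid transforms (4x4), as reported/derived from the optical tracker:
T_tracker_from_probe = np.eye(4)        # pose of the ultrasound probe marker
T_tracker_from_microscope = np.eye(4)   # pose of the microscope marker
T_probe_from_image = np.eye(4)          # probe calibration: US pixel plane -> probe marker
T_probe_from_image[:3, 3] = [0.0, 0.0, 40.0]   # e.g. image plane 40 mm below the marker

# Chain: ultrasound image -> probe -> tracker -> microscope camera
T_microscope_from_image = (np.linalg.inv(T_tracker_from_microscope)
                           @ T_tracker_from_probe
                           @ T_probe_from_image)

# Assumed pinhole intrinsics of the microscope overlay camera (placeholder values).
K = np.array([[2000.0, 0.0, 640.0],
              [0.0, 2000.0, 512.0],
              [0.0, 0.0, 1.0]])

# A few ultrasound pixel positions expressed in millimetres on the image plane (z = 0),
# one point per column.
us_points_mm = np.array([[0.0, 10.0, 20.0],
                         [0.0,  5.0, 15.0],
                         [0.0,  0.0,  0.0]])

cam_points = (T_microscope_from_image @ to_homogeneous(us_points_mm))[:3]
pixels = K @ cam_points
pixels = pixels[:2] / pixels[2]          # perspective division
print(pixels.T)                          # overlay coordinates in the microscope view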

Relevance:

100.00%

Publisher:

Abstract:

PURPOSE: The aim of this study is to implement augmented reality in real-time image-guided interstitial brachytherapy to allow an intuitive real-time intraoperative orientation. METHODS AND MATERIALS: The developed system consists of a common video projector, two high-resolution charge-coupled device (CCD) cameras, and an off-the-shelf notebook. The projector was used as a scanning device by projecting coded-light patterns to register the patient and superimpose the operating field with planning data and additional information in arbitrary colors. Subsequent movements of the nonfixed patient were detected by stereoscopically tracking passive markers attached to the patient. RESULTS: In a first clinical study, we evaluated the whole process chain from image acquisition to data projection and determined overall accuracy with 10 patients undergoing implantation. The described method enabled the surgeon to visualize planning data on top of any preoperatively segmented and triangulated surface (skin) with direct line of sight during the operation. Furthermore, the tracking system allowed dynamic adjustment of the data to the patient's current position and therefore eliminated the need for rigid fixation. Because of soft-part displacement, we obtained an average deviation of 1.1 mm when moving the patient, whereas changing the projector's position resulted in an average deviation of 0.9 mm. The mean deviation of all needles of an implant was 1.4 mm (range, 0.3-2.7 mm). CONCLUSIONS: The developed low-cost augmented-reality system proved to be accurate and feasible in interstitial brachytherapy. The system meets clinical demands and enables intuitive real-time intraoperative orientation and monitoring of needle implantation.
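
The coded-light registration can be illustrated with Gray-code stripe patterns, one common choice for such patterns; the sketch below generates the projector patterns and decodes a pixel's observed bit sequence back to a projector column. The resolution and the encoding are assumptions, not taken from the paper.

# Minimal sketch of Gray-code structured light, one common form of coded-light
# patterns (not necessarily the authors' exact codes).
import numpy as np

def gray_code_patterns(width, height, n_bits):
    """Return a list of binary stripe patterns; decoding the observed sequence
    per camera pixel yields the projector column that illuminated it."""
    columns = np.arange(width)
    gray = columns ^ (columns >> 1)          # binary-reflected Gray code per column
    patterns = []
    for bit in range(n_bits - 1, -1, -1):    # most significant bit first
        stripe = ((gray >> bit) & 1).astype(np.uint8) * 255
        patterns.append(np.tile(stripe, (height, 1)))
    return patterns

def decode_column(bit_sequence):
    """Convert one pixel's observed Gray-code bit sequence back to a column index."""
    value = 0
    for bit in bit_sequence:                 # Gray -> binary, MSB first
        value = (value << 1) | (bit ^ (value & 1))
    return value

patterns = gray_code_patterns(width=1024, height=768, n_bits=10)
# Simulate a camera pixel that sees projector column 300:
observed = [int(p[0, 300] > 127) for p in patterns]
print(decode_column(observed))               # -> 300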

Relevance:

100.00%

Publisher:

Abstract:

Presenting visual feedback for image-guided surgery on a monitor requires the surgeon to perform time-consuming comparisons and diversion of sight and attention away from the patient. Deficiencies in previously developed augmented reality systems for image-guided surgery have, however, prevented the general acceptance of any one technique as a viable alternative to monitor displays. This work presents an evaluation of the feasibility and versatility of a novel augmented reality approach for the visualisation of surgical planning and navigation data. The approach, which utilises a portable image overlay device, was evaluated during integration into existing surgical navigation systems and during application within simulated navigated surgery scenarios.

Relevance:

100.00%

Publisher:

Abstract:

During endoscopic surgery, it is difficult to ascertain the anatomical landmarks once the anatomy has been manipulated or when the operating area is filled with blood. An augmented reality system will enhance the endoscopic view and further enable surgeons to view hidden critical structures or the results of preoperative planning.

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: In this paper we present a landmark-based augmented reality (AR) endoscope system for endoscopic paranasal and transnasal surgery, along with fast and automatic calibration and registration procedures for the endoscope. METHODS: Preoperatively, the surgeon selects natural landmarks or defines new landmarks in the CT volume. After proper registration of the preoperative CT to the patient, these landmarks are overlaid on the endoscopic video stream. The name of each landmark, along with its selected colour and its distance from the endoscope tip, is also displayed. The endoscope optics are calibrated and registered by fast and automatic methods. The accuracy of the system was evaluated in metallic-grid and cadaver set-ups. RESULTS: The root mean square (RMS) error of the system was 0.8 mm in a controlled laboratory set-up (metallic grid) and 2.25 mm during cadaver studies. CONCLUSIONS: A novel landmark-based AR endoscope system was implemented and its accuracy evaluated. Augmented landmarks help the surgeon to orient and navigate within the surgical field. The studies demonstrate the capability of the system for the proposed application. Further clinical studies are planned in the near future.
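
The CT-to-patient registration step mentioned above can be sketched as a paired-point rigid registration (Kabsch/Horn style). The fiducial coordinates below are invented, and this is a generic illustration rather than the authors' registration procedure.

# Generic paired-point rigid registration sketch (not the authors' procedure);
# fiducial coordinates are invented.
import numpy as np

def rigid_register(ct_points, patient_points):
    """Least-squares rigid transform mapping CT fiducials onto patient fiducials.
    Both inputs are (N, 3) arrays of corresponding points."""
    ct_centroid = ct_points.mean(axis=0)
    pat_centroid = patient_points.mean(axis=0)
    H = (ct_points - ct_centroid).T @ (patient_points - pat_centroid)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = pat_centroid - R @ ct_centroid
    return R, t

# Invented fiducials: patient points are a rotated and translated copy of the CT points.
rng = np.random.default_rng(0)
ct = rng.uniform(-50, 50, size=(6, 3))
angle = np.deg2rad(20)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
patient = ct @ R_true.T + np.array([10.0, -5.0, 30.0])

R, t = rigid_register(ct, patient)
fre = np.linalg.norm(ct @ R.T + t - patient, axis=1).mean()   # fiducial registration error
print(round(fre, 6))    # ~0 for noise-free correspondences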

Relevance:

100.00%

Publisher:

Abstract:

Intraoperative laparoscopic calibration remains a challenging task. In this work we present a new method and instrumentation for intraoperative camera calibration. In contrast to conventional calibration methods, the proposed technique allows intraoperative laparoscope calibration from single perspective observations, resulting in a standardized scheme for calibration in a clinical scenario. Results show an average displacement error of 0.52 ± 0.19 mm, indicating sufficient accuracy for clinical use. Additionally, the proposed method was validated clinically by performing a calibration during surgery.
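
The abstract does not describe the calibration method itself, so the following is only a generic sketch of how a camera can be calibrated from a single perspective observation: a direct linear transform (DLT) that estimates the projection matrix from one view of a known non-planar target. All coordinates and parameters are invented.

# Generic single-view DLT calibration sketch (not the authors' method).
import numpy as np

def dlt_projection_matrix(object_points, image_points):
    """Estimate the 3x4 projection matrix P from >= 6 non-coplanar 2D-3D pairs."""
    A = []
    for (X, Y, Z), (u, v) in zip(object_points, image_points):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 4)          # null-space solution, up to scale

# Invented ground truth to exercise the routine: P = K [R | t] with a simple pose.
K_true = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
Rt_true = np.hstack([np.eye(3), np.array([[5.0], [2.0], [100.0]])])
P_true = K_true @ Rt_true

obj = np.array([[0, 0, 0], [30, 0, 0], [0, 30, 0], [0, 0, 30],
                [30, 30, 10], [10, 20, 35], [25, 5, 15]], dtype=float)
img_h = P_true @ np.hstack([obj, np.ones((len(obj), 1))]).T
img = (img_h[:2] / img_h[2]).T

P_est = dlt_projection_matrix(obj, img)
P_est /= P_est[-1, -1]                   # fix the arbitrary scale for comparison
print(np.abs(P_est - P_true / P_true[-1, -1]).max())   # ~0 for noise-free points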

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: In this paper, we present a new method for the calibration of a microscope and its registration using an active optical tracker. METHODS: In practice, both operations are performed simultaneously by moving an active optical marker within the field of view of the two devices. The IR LEDs composing the marker are first segmented from the microscope images. Knowing their corresponding three-dimensional (3D) positions in the optical tracker reference system, it is possible to find the transformation matrix between the reference frames of the two devices. Registration and calibration parameters can be extracted directly from that transformation. In addition, since the zoom and focus can be modified by the surgeon during the operation, we propose a spline-based method to update the camera model to the new settings. RESULTS: The proposed technique is currently being used in an augmented reality system for image-guided surgery in the fields of ear, nose and throat (ENT) and craniomaxillofacial surgery. CONCLUSIONS: The results have proved to be accurate, and the technique is a fast, dynamic and reliable way to calibrate and register the two devices in an OR environment.
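
The spline-based update of the camera model can be sketched as interpolating calibrated intrinsics over the zoom setting; the sampled focal lengths and the one-dimensional parameterisation below are assumptions for illustration, not the authors' actual model.

# Illustrative spline-based intrinsics update over zoom (invented samples).
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical calibration samples: focal length (pixels) measured at a few
# discrete zoom settings of the microscope.
zoom_settings = np.array([1.0, 1.5, 2.0, 2.5, 3.0])
focal_lengths = np.array([1800.0, 2600.0, 3500.0, 4300.0, 5200.0])

focal_of_zoom = CubicSpline(zoom_settings, focal_lengths)

def camera_matrix(zoom, cx=640.0, cy=512.0):
    """Rebuild the pinhole intrinsics for the current zoom setting."""
    f = float(focal_of_zoom(zoom))
    return np.array([[f, 0.0, cx],
                     [0.0, f, cy],
                     [0.0, 0.0, 1.0]])

print(camera_matrix(2.2))   # interpolated intrinsics between calibrated settings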

Relevance:

100.00%

Publisher:

Abstract:

Limitations of the visual information provided to surgeons during laparoscopic surgery increase the difficulty of procedures and thus reduce clinical indications and increase training time. This work presents a novel augmented reality visualization approach that aims to improve the visual data supplied for the targeting of non-visible anatomical structures in laparoscopic visceral surgery. The approach aims to facilitate the localisation of hidden structures with minimal damage to surrounding structures and with minimal training requirements. The proposed augmented reality visualization approach overlays endoscopic images with virtual 3D models of underlying critical structures, in addition to targeting and depth information pertaining to the targeted structures. Image overlay was achieved by implementing camera calibration techniques and integrating the optically tracked endoscope into an existing image guidance system for liver surgery. The approach was validated in accuracy, clinical integration and targeting experiments. The accuracy of the overlay had a mean value of 3.5 mm ± 1.9 mm, and 92.7% of targets within a liver phantom were successfully located laparoscopically by non-trained subjects using the approach.
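
The targeting and depth cues described above can be sketched as follows: given an assumed endoscope pose and intrinsics, report a hidden target's depth along the optical axis and its offset from the image centre. All poses, intrinsics and the target position are placeholder values.

# Hedged sketch of a targeting/depth cue; all values are placeholders, not the
# authors' system parameters.
import numpy as np

K = np.array([[900.0, 0.0, 480.0],        # assumed endoscope intrinsics
              [0.0, 900.0, 270.0],
              [0.0, 0.0, 1.0]])

T_camera_from_world = np.eye(4)           # pose from the optically tracked endoscope
T_camera_from_world[:3, 3] = [-2.0, 3.0, 5.0]

target_world = np.array([12.0, -8.0, 85.0, 1.0])   # hidden structure from the 3-D model (mm)

target_cam = T_camera_from_world @ target_world
depth_mm = target_cam[2]                  # distance along the optical axis

pixel_h = K @ target_cam[:3]
pixel = pixel_h[:2] / pixel_h[2]
offset_px = pixel - K[:2, 2]              # lateral offset from the principal point

print(f"depth: {depth_mm:.1f} mm, on-screen offset: {offset_px.round(1)} px")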