72 results for computer-based technology
Abstract:
This paper presents a novel technique for computerized fluoroscopy with zero-dose image updates for computer-assisted, fluoroscopy-based closed reduction and osteosynthesis of diaphyseal femoral fractures. With this technique, repositioning of bone fragments during closed fracture reduction leads to image updates in each acquired imaging plane, which is equivalent to using several fluoroscopes simultaneously from different directions but without any X-ray radiation. When combined with existing leg-length and antetorsion restoration methods, the technique facilitates the entire fracture reduction and osteosynthesis procedure and may greatly reduce X-ray exposure for both the patient and the surgical team. We present the approach used to achieve this technique and experimental results with plastic bones.
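As a hedged illustration of the zero-dose update idea (not code from the paper), the sketch below assumes a calibrated 3x4 projection matrix for each acquired fluoroscopic plane and an optically tracked rigid transform per bone fragment; the fragment's contour points are simply re-projected into every stored plane whenever the tracker reports motion, so all overlays update without further X-ray shots.

```python
import numpy as np

def reproject_fragment(P, T_fragment, points_3d):
    """Project tracked 3D fragment points into one acquired fluoro plane.

    P          : 3x4 projection matrix of the fluoroscope pose at acquisition
                 time (fixed, so no new radiation is needed).
    T_fragment : 4x4 rigid transform of the fragment reported by the tracker.
    points_3d  : (N, 3) fragment model/contour points in tracker coordinates.
    Returns (N, 2) updated pixel coordinates for the overlay.
    """
    pts_h = np.hstack([points_3d, np.ones((points_3d.shape[0], 1))])  # homogeneous
    moved = T_fragment @ pts_h.T          # apply the current fragment pose
    proj = P @ moved                      # project into the stored image plane
    return (proj[:2] / proj[2]).T         # dehomogenize to pixel coordinates

# Each stored plane keeps its own P; a tracker callback re-renders all overlays.
```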
Abstract:
A new system for computer-aided corrective surgery of the jaws has been developed and introduced clinically. It combines three-dimensional (3-D) surgical planning with conventional dental occlusion planning. The software allows the surgical correction to be simulated on virtual 3-D models of the facial skeleton generated from computed tomography (CT) scans. Surgery planning and simulation include dynamic cephalometry, semi-automatic mirroring, interactive cutting of bone, and segment repositioning. By coupling the software with a tracking system and using a special registration procedure, dental occlusion plans can be acquired from plaster model mounts. Upon completion of the surgical plan, the setup is used to manufacture positioning splints for intraoperative guidance. The system provides further intraoperative assistance through a display showing jaw positions and 3-D positioning guides updated in real time during the surgical procedure. The proposed approach offers the advantages of 3-D visualization and tracking technology without sacrificing long-proven cast-based techniques for dental occlusion evaluation. The system has been applied to one patient; throughout this procedure, we experienced improved assessment of the pathology, increased precision, and augmented control.
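The abstract does not specify how the registration procedure works; purely as an illustrative sketch, the snippet below implements a standard paired-point rigid registration (Kabsch/SVD), the kind of step that could relate landmarks digitized on the plaster model mount to the CT-derived virtual model.

```python
import numpy as np

def paired_point_registration(src, dst):
    """Least-squares rigid transform (R, t) mapping src landmarks onto dst.

    src, dst : (N, 3) corresponding landmark sets, e.g. points digitized with a
               tracked pointer on the plaster model mount and the matching
               points picked on the CT-derived 3-D model.
    """
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)                  # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))               # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t
```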
Abstract:
Surgical navigation systems visualize the positions and orientations of surgical instruments and implants as graphical overlays onto a medical image of the operated anatomy on a computer monitor. Orthopaedic surgical navigation systems can be categorized according to the image modalities used to visualize the surgical action. In so-called CT-based or 'surgeon-defined anatomy' systems, where a 3D volume or surface representation of the operated anatomy can be constructed from preoperatively acquired tomographic data or from intraoperatively digitized anatomical landmarks, photorealistic rendering of the surgical action has been shown to greatly improve the usability of these navigation systems. However, this may not hold true when the virtual representation of surgical instruments and implants is superimposed onto 2D projection images in a fluoroscopy-based navigation system, due to the so-called image occlusion problem. Image occlusion occurs when the field of view of the fluoroscopic image is occupied by the virtual representation of surgical implants or instruments. In these situations, the surgeon may miss part of the image details, even if transparency and/or wire-frame rendering is used. In this paper, we propose to use non-photorealistic rendering to overcome this difficulty. Laboratory testing results on foamed plastic bones during various computer-assisted, fluoroscopy-based surgical procedures, including total hip arthroplasty and long-bone fracture reduction and osteosynthesis, are presented.
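One simple form of non-photorealistic rendering is to draw only the occluding contour of the projected instrument or implant, leaving the underlying fluoroscopic pixels untouched; the sketch below illustrates this idea and is not necessarily the rendering style used in the paper.

```python
import numpy as np
from scipy.ndimage import binary_erosion

def silhouette_overlay(fluoro, instrument_mask, color=255):
    """Draw only the outline of a projected instrument/implant.

    fluoro          : 2D grayscale fluoroscopic image (uint8).
    instrument_mask : boolean mask of the instrument's projected footprint.
    Only a thin contour is painted, so image details under the instrument
    remain visible, unlike filled or semi-transparent rendering.
    """
    outline = instrument_mask & ~binary_erosion(instrument_mask, iterations=2)
    out = fluoro.copy()
    out[outline] = color
    return out
```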
Abstract:
PURPOSE: To assess the literature on the accuracy and clinical performance of computer technology applications in surgical implant dentistry. MATERIALS AND METHODS: Electronic and manual literature searches were conducted to collect information about (1) the accuracy and (2) the clinical performance of computer-assisted implant systems. Meta-regression analysis was performed to summarize the accuracy studies. Failure/complication rates were analyzed using random-effects Poisson regression models to obtain summary estimates of 12-month proportions. RESULTS: Twenty-nine different image guidance systems were included. From 2,827 articles, 13 clinical and 19 accuracy studies were included in this systematic review. The meta-analysis of accuracy (19 clinical and preclinical studies) revealed a total mean error of 0.74 mm (maximum of 4.5 mm) at the entry point in the bone and 0.85 mm at the apex (maximum of 7.1 mm). For the 5 included clinical studies (506 implants in total) using computer-assisted implant dentistry, the mean failure rate was 3.36% (range 0% to 8.45%) after an observation period of at least 12 months. Intraoperative complications were reported in 4.6% of the treated cases; these included limited interocclusal distances for guided implant placement, limited primary implant stability, and the need for additional grafting procedures. CONCLUSION: Differing levels and quantities of evidence were available for computer-assisted implant placement, revealing high implant survival rates after only 12 months of observation in different indications and a reasonable level of accuracy. However, future long-term clinical data are necessary to identify clinical indications and to justify the additional radiation dose, effort, and cost associated with computer-assisted implant surgery.
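For orientation only, the snippet below shows one standard way to pool per-study 12-month failure proportions with a random-effects model (DerSimonian-Laird on log rates); it is a simplified stand-in for the random-effects Poisson regression used in the review, and the per-study counts are invented, chosen merely so the totals echo the 506 implants mentioned above.

```python
import numpy as np

def pooled_failure_rate(events, implants):
    """DerSimonian-Laird random-effects pooling of per-study failure
    proportions on the log scale (illustrative, not the review's exact model)."""
    events, implants = np.asarray(events, float), np.asarray(implants, float)
    y = np.log((events + 0.5) / implants)        # log rate, 0.5 continuity correction
    v = 1.0 / (events + 0.5)                     # approximate variance of the log rate
    w = 1.0 / v
    y_fixed = np.sum(w * y) / w.sum()
    q = np.sum(w * (y - y_fixed) ** 2)           # heterogeneity statistic
    tau2 = max(0.0, (q - (len(y) - 1)) / (w.sum() - (w ** 2).sum() / w.sum()))
    w_star = 1.0 / (v + tau2)
    return np.exp(np.sum(w_star * y) / w_star.sum())   # pooled 12-month proportion

# Invented per-study counts (5 studies, 506 implants in total), for illustration only.
print(pooled_failure_rate(events=[0, 3, 5, 2, 7], implants=[80, 120, 90, 66, 150]))
```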
Abstract:
Introduction: The aim of this systematic review was to analyze the dental literature regarding the accuracy and clinical application of computer-guided, template-based implant dentistry. Materials and methods: An electronic literature search complemented by manual searching was performed to gather data on accuracy and on surgical, biological, and prosthetic complications in connection with computer-guided implant treatment. For the assessment of accuracy, meta-regression analysis was performed. Complication rates are summarized descriptively. Results: Of 3120 titles retrieved by the literature search, eight articles met the inclusion criteria regarding accuracy and ten regarding clinical performance. Meta-regression analysis revealed a mean deviation of 1.07 mm (95% CI: 0.76-1.22 mm) at the entry point and 1.63 mm (95% CI: 1.26-2 mm) at the apex. No significant differences between the studies were found regarding the method of template production or the mode of template support and stabilization. Early surgical complications occurred in 9.1% of cases, early prosthetic complications in 18.8%, and late prosthetic complications in 12%. Six clinical studies comprising 537 implants, mainly restored immediately after flapless implantation procedures, reported implant survival rates of 91-100% after observation periods of 12-60 months. Conclusion: Computer-guided, template-based implant placement showed high implant survival rates ranging from 91% to 100%. However, a considerable number of technique-related perioperative complications were observed. Preclinical and clinical studies indicated a reasonable mean accuracy with relatively high maximum deviations. Future research should aim to increase the number of clinical studies with longer observation periods and to improve the systems in terms of perioperative handling, accuracy, and prosthetic complications.
Abstract:
Quantitative characterisation of carotid atherosclerosis and classification of plaques as symptomatic or asymptomatic are crucial for planning the optimal treatment of atheromatous plaque. The computer-aided diagnosis (CAD) system described in this paper can analyse ultrasound (US) images of the carotid artery and classify them as symptomatic or asymptomatic based on their echogenicity characteristics. The CAD system consists of three modules: (a) a feature extraction module, where first-order statistical (FOS) features and Laws' texture energy are estimated; (b) a dimensionality reduction module, where the number of features is reduced using analysis of variance (ANOVA); and (c) a classifier module consisting of a neural network (NN) trained by a novel hybrid method based on genetic algorithms (GAs) combined with the back-propagation algorithm. The hybrid method is able to select the most robust features, automatically adjust the NN architecture, and optimise the classification performance. Performance is measured by accuracy, sensitivity, specificity, and the area under the receiver-operating characteristic (ROC) curve. The CAD design and development are based on images from 54 symptomatic and 54 asymptomatic plaques. This study demonstrates the ability of a CAD system based on US image analysis and a hybrid-trained NN to identify atheromatous plaques at high risk of stroke.
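A hedged sketch of the feature-extraction module: Laws' texture energy maps are obtained by convolving the ultrasound region of interest with 2D masks formed from Laws' 1D kernels and then averaging the absolute responses over a local window. The 15x15 window and the kernel subset below are assumptions, not parameters reported in the paper.

```python
import numpy as np
from scipy.ndimage import convolve, uniform_filter

# Laws' 1D kernels; the 2D masks are their outer products.
KERNELS = {
    "L5": np.array([1, 4, 6, 4, 1], float),      # level
    "E5": np.array([-1, -2, 0, 2, 1], float),    # edge
    "S5": np.array([-1, 0, 2, 0, -1], float),    # spot
    "R5": np.array([1, -4, 6, -4, 1], float),    # ripple
}

def laws_texture_energy(image, window=15):
    """Laws' texture energy maps for a grayscale ultrasound ROI.

    Returns a dict of energy images, one per 2D mask (e.g. 'L5E5'), computed as
    the local mean of absolute filter responses over a window x window region.
    """
    image = image - image.mean()                  # remove mean intensity bias
    energies = {}
    for n1, k1 in KERNELS.items():
        for n2, k2 in KERNELS.items():
            mask = np.outer(k1, k2)
            response = convolve(image, mask, mode="reflect")
            energies[n1 + n2] = uniform_filter(np.abs(response), size=window)
    return energies
```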
Abstract:
Current advanced cloud infrastructure management solutions allow actions to be scheduled for dynamically changing the number of running virtual machines (VMs). This approach, however, does not guarantee that the scheduled number of VMs will properly handle the actual user-generated workload, especially if user utilization patterns change. We propose using a dynamically generated scaling model for the VMs containing the services of distributed applications, which is able to react to variations in the number of application users. We answer the following question: how can we dynamically decide how many services of each type are needed to handle a larger workload within the same time constraints? We describe a mechanism for dynamically composing the Service Level Agreements (SLAs) that control the scaling of distributed services by combining data analysis mechanisms with application benchmarking over multiple VM configurations. Based on the data sets generated by multiple application benchmarks, we discover a set of service monitoring metrics able to predict critical SLA parameters. By combining this set of predictor metrics with a heuristic for selecting appropriate scaling-out paths for the services of distributed applications, we show how SLA scaling rules can be inferred and then used to control the runtime scale-in and scale-out of distributed services. We validate our architecture and models by performing scaling experiments with a distributed application representative of the enterprise class of information systems, and we show how dynamically generated SLAs can be successfully used to control the management of distributed service scaling.
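As a minimal, assumption-laden sketch of how inferred SLA scaling rules could drive runtime scale-in and scale-out (service names, metrics, and thresholds are invented, and the benchmark-driven learning of the thresholds is omitted):

```python
from dataclasses import dataclass

@dataclass
class ScalingRule:
    """One inferred SLA scaling rule: when the predictor metric crosses a
    threshold learned from benchmark data, add or remove instances of a
    service. All names and thresholds here are illustrative."""
    service: str
    metric: str                # e.g. "queue_length" or "cpu_utilization"
    scale_out_above: float
    scale_in_below: float
    step: int = 1

def decide(rule: ScalingRule, current_metric: float, running: int) -> int:
    """Return the new target number of VM instances for one service."""
    if current_metric > rule.scale_out_above:
        return running + rule.step             # workload grew: scale out
    if current_metric < rule.scale_in_below and running > 1:
        return running - rule.step             # workload shrank: scale in
    return running

rule = ScalingRule("order-service", "queue_length", scale_out_above=50, scale_in_below=10)
print(decide(rule, current_metric=72.0, running=3))   # -> 4
```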
Abstract:
This article addresses the issue of kriging-based optimization of stochastic simulators. Many such simulators depend on factors that tune the level of precision of the response, with gains in accuracy coming at the price of computational time. The contribution of this work is twofold: first, we propose a quantile-based criterion for the sequential design of experiments, in the fashion of the classical expected improvement criterion, which allows an elegant treatment of heterogeneous response precisions. Second, we present a procedure for allocating the computational time given to each measurement, allowing a better distribution of the computational effort and increased efficiency. Finally, the optimization method is applied to an original application in nuclear criticality safety. This article has supplementary material available online. The proposed criterion is available in the R package DiceOptim.
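For intuition, the sketch below evaluates an expected-improvement-style acquisition value on a kriging quantile q(x) = m(x) + z_beta * s(x), where z_beta is the standard normal quantile at level beta; it mirrors the structure of a quantile-based criterion but is not the exact closed form derived in the article or implemented in DiceOptim.

```python
import numpy as np
from scipy.stats import norm

def quantile_improvement(mean, sd, beta, best_quantile):
    """EI-style acquisition value on the kriging quantile (simplified sketch).

    mean, sd      : kriging predictor mean and standard deviation at a candidate x.
    beta          : quantile level (e.g. 0.9) accounting for noisy responses.
    best_quantile : best (lowest) quantile value observed so far, for minimization.
    """
    q = mean + norm.ppf(beta) * sd            # kriging quantile at x
    if sd <= 0:
        return max(best_quantile - q, 0.0)
    z = (best_quantile - q) / sd
    return (best_quantile - q) * norm.cdf(z) + sd * norm.pdf(z)

print(quantile_improvement(mean=1.2, sd=0.3, beta=0.9, best_quantile=1.5))
```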
Abstract:
BACKGROUND: Driving a car is a complex instrumental activity of daily living, and driving performance is very sensitive to cognitive impairment. The assessment of driving-relevant cognition in older drivers is challenging and requires reliable and valid tests with good sensitivity and specificity to predict safe driving. Driving simulators can be used to test fitness to drive, and several studies have found strong correlations between driving simulator performance and on-the-road driving. However, access to driving simulators is restricted to specialists, and simulators are too expensive, large, and complex to be easily accessible to older drivers or the physicians advising them. An easily accessible, Web-based cognitive screening test could offer a solution to this problem. The World Wide Web allows easy dissemination of the test software and implementation of the scoring algorithm on a central server, which allows a dynamically growing database of normative values to be built and ensures that all users have access to the same up-to-date normative values. OBJECTIVE: In this pilot study, we present the novel Web-based Bern Cognitive Screening Test (wBCST) and investigate whether it can predict poor simulated driving performance in healthy and cognitively impaired participants. METHODS: wBCST performance and simulated driving performance were analyzed in 26 healthy younger and 44 healthy older participants, as well as in 10 older participants with cognitive impairment. Correlations between the two tests were calculated. In addition, simulated driving performance was used to group the participants into good performers (n=70) and poor performers (n=10). A receiver-operating characteristic analysis was performed to determine the sensitivity and specificity of the wBCST in predicting simulated driving performance. RESULTS: The mean wBCST score of participants with poor simulated driving performance was reduced by 52% compared to participants with good simulated driving performance (P<.001). The area under the receiver-operating characteristic curve was 0.80, with a 95% confidence interval of 0.68-0.92. CONCLUSIONS: When selecting a 75% test score as the cutoff, the novel test has 83% sensitivity, 70% specificity, and 81% efficiency, which are good values for a screening test. Overall, in this pilot study, the novel Web-based computer test appears to be a promising tool for supporting clinicians in fitness-to-drive assessments of older drivers. Web-based distribution and scoring on a central computer will facilitate further evaluation of the novel test setup. We expect that in the near future, Web-based computer tests will become a valid and reliable tool for clinicians, for example, when assessing fitness to drive in older drivers.
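To make the reported cutoff-based figures concrete, the snippet below shows how sensitivity, specificity, and efficiency (overall accuracy) are computed from test scores and a simulator-based reference standard; the 0-1 score scaling and any data passed in are illustrative, not the study data.

```python
import numpy as np

def screening_metrics(scores, is_poor_driver, cutoff=0.75):
    """Sensitivity, specificity, and efficiency of a screening test.

    scores         : test scores scaled to 0-1 (lower = worse performance).
    is_poor_driver : boolean array from the simulator-based reference standard.
    cutoff         : scores below this value are flagged as 'poor'; 0.75 mirrors
                     the 75% cutoff discussed in the abstract.
    """
    scores = np.asarray(scores, float)
    is_poor_driver = np.asarray(is_poor_driver, bool)
    flagged = scores < cutoff
    tp = np.sum(flagged & is_poor_driver)      # poor drivers correctly flagged
    tn = np.sum(~flagged & ~is_poor_driver)    # good drivers correctly passed
    fp = np.sum(flagged & ~is_poor_driver)
    fn = np.sum(~flagged & is_poor_driver)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    efficiency = (tp + tn) / len(scores)       # overall accuracy
    return sensitivity, specificity, efficiency
```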
Abstract:
Internet of Things (IoT)-based systems are anticipated to gain widespread use in industrial applications. Standardization efforts such as 6LoWPAN and the Constrained Application Protocol (CoAP) have made it possible to integrate wireless sensor nodes using Internet technology and web-like access to data (RESTful service access). While some issues remain open, the interoperability problem in the lower layers can now be considered solved from an enterprise software vendor's point of view. One possible next step towards integrating real-world objects into enterprise systems, and solving the corresponding interoperability problems at higher levels, is to use semantic web technologies. We introduce an abstraction of real-world objects, called Semantic Physical Business Entities (SPBE), based on Linked Data principles. We show that this abstraction fits well into enterprise systems, as SPBEs allow a business-object-centric view of real-world objects instead of a purely device-centric view. We outline the interdependencies between how services in an enterprise system are currently used and how they can be used in a semantic, real-world-aware enterprise system, arguing for the need for semantic services and semantic knowledge repositories. We introduce a lightweight query language, which we use to perform a quantitative analysis of our approach to demonstrate its feasibility.
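The paper's lightweight query language is not reproduced here; as an assumption-laden illustration of the Linked Data view on SPBEs, the sketch below builds a tiny RDF graph with rdflib and runs a business-object-centric SPARQL query (the spbe vocabulary and example entities are invented).

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

# Hypothetical vocabulary for Semantic Physical Business Entities (SPBE);
# the namespace and properties are illustrative, not the paper's actual schema.
SPBE = Namespace("http://example.org/spbe#")

g = Graph()
pallet = SPBE["pallet-42"]
g.add((pallet, RDF.type, SPBE.BusinessEntity))
g.add((pallet, SPBE.locatedIn, SPBE["warehouse-A"]))
g.add((pallet, SPBE.temperature, Literal(7.4)))   # value reported by a CoAP sensor node

# Business-object-centric query: which entities in warehouse A exceed 5 degrees?
results = g.query("""
    PREFIX spbe: <http://example.org/spbe#>
    SELECT ?entity ?temp WHERE {
        ?entity a spbe:BusinessEntity ;
                spbe:locatedIn spbe:warehouse-A ;
                spbe:temperature ?temp .
        FILTER (?temp > 5)
    }
""")
for entity, temp in results:
    print(entity, temp)
```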
Abstract:
Commercially available assays for the simultaneous detection of multiple inflammatory and cardiac markers in porcine blood samples are currently lacking. Therefore, this study aimed to develop a bead-based, multiplexed flow cytometric assay to simultaneously detect porcine cytokines [interleukin (IL)-1β, IL-6, IL-10, and tumor necrosis factor alpha], chemokines (IL-8 and monocyte chemotactic protein 1), growth factors [basic fibroblast growth factor (bFGF), vascular endothelial growth factor, and platelet-derived growth factor-bb], and injury markers (cardiac troponin-I), as well as complement activation markers (C5a and sC5b-9). The method was based on the Luminex xMAP technology, resulting in the assembly of a 6-plex and an 11-plex from the respective individual singleplex assays. The assays were evaluated for dynamic range, sensitivity, cross-reactivity, intra-assay and interassay variance, spike recovery, and correlation with commercially available enzyme-linked immunosorbent assays as well as with the respective singleplex assays. The detection range was 2.5-30,000 pg/ml for all analytes (6- and 11-plex assays), except for soluble C5b-9, which had a detection range of 2-10,000 ng/ml (11-plex). Cross-reactivity between analytes was typically very low (<3% and <1.4% for the 11-plex and 6-plex, respectively). Intra-assay variances ranged from 4.9 to 7.4% (6-plex) and 5.3 to 12.9% (11-plex). Interassay variances for cytokines were between 8.1 and 28.8% (6-plex) and 10.1 and 26.4% (11-plex). Correlation coefficients with the singleplex assays were high for both the 6-plex and the 11-plex, ranging from 0.988 to 0.997 and 0.913 to 0.999, respectively. In this study, bead-based porcine 11-plex and 6-plex assays with good sensitivity, broad dynamic range, and low intra-assay variance and cross-reactivity were established. These assays therefore represent a new, useful tool for the analysis of samples generated in experiments with pigs.
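For reference, the snippet below shows how intra-assay coefficients of variation and singleplex-multiplex correlations of the kind reported above are typically computed; the replicate values are invented for illustration.

```python
import numpy as np

def intra_assay_cv(replicates):
    """Intra-assay coefficient of variation (%) from replicate measurements
    of the same sample on one plate (illustrative values, not study data)."""
    replicates = np.asarray(replicates, float)
    return 100.0 * replicates.std(ddof=1) / replicates.mean()

def singleplex_multiplex_correlation(singleplex, multiplex):
    """Pearson correlation between singleplex and multiplex readouts."""
    return np.corrcoef(singleplex, multiplex)[0, 1]

print(intra_assay_cv([412.0, 398.5, 430.1, 405.7]))                      # CV in %
print(singleplex_multiplex_correlation([12, 55, 230, 870], [14, 57, 221, 902]))
```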
Abstract:
The human face is a vital component of our identity, and many people undergo medical aesthetic procedures in order to achieve an ideal or desired look. However, communication between physician and patient is fundamental to understanding the patient's wishes and achieving the desired results. To date, most plastic surgeons rely either on "freehand" 2D drawings on picture printouts or on computerized picture morphing. Alternatively, hardware-dependent solutions allow facial shapes to be created and planned in 3D, but they are usually expensive or complex to handle. To offer a simple and hardware-independent solution, we propose a web-based application that uses three standard 2D pictures to create a 3D representation of the patient's face, on which facial aesthetic procedures such as filling, skin clearing or rejuvenation, and rhinoplasty are planned in 3D. The proposed application couples a set of well-established methods in a novel manner to optimize 3D reconstructions for clinical use. Face reconstructions performed with the application were evaluated by two plastic surgeons and compared to ground-truth data. The results showed that the application can provide 3D face representations accurate enough for clinical use (average error within 2 mm) in less than 5 minutes.
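A minimal sketch of one common way to quantify such reconstruction accuracy, assuming the reconstructed and ground-truth meshes are already aligned (the paper's evaluation protocol may differ): the mean nearest-neighbour distance from reconstructed vertices to the ground-truth surface.

```python
import numpy as np
from scipy.spatial import cKDTree

def mean_surface_error(reconstructed, ground_truth):
    """Mean point-to-nearest-point distance (in mm) between a reconstructed
    face mesh and a ground-truth scan, both given as (N, 3) vertex arrays.
    A simplified stand-in for the evaluation; meshes are assumed pre-aligned."""
    tree = cKDTree(ground_truth)
    distances, _ = tree.query(reconstructed)   # nearest ground-truth vertex per point
    return distances.mean()

# e.g. mean_surface_error(recon_vertices, scan_vertices) -> ~2.0 for a good fit
```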