166 results for clustering techniques


Relevance:

20.00%

Publisher:

Abstract:

Introduction: Difficult tracheal intubation remains a constant and significant source of morbidity and mortality in anaesthetic practice. Insufficient airway assessment in the preoperative period continues to be a major cause of unanticipated difficult intubation. Although many risk factors have already been identified, preoperative airway evaluation is not always regarded as a standard procedure, and the respective weight of each risk factor remains unclear. Moreover, the available predictive scores have low sensitivity, are only moderately specific, and are often operator-dependent. In order to improve the preoperative detection of patients at risk for difficult intubation, we developed a system for automated and objective evaluation of morphologic criteria of the face and neck, using video recordings and advanced techniques borrowed from face recognition. Method and results: Frontal video sequences were recorded in 5 healthy volunteers. During the video recording, subjects were requested to perform maximal flexion-extension of the neck and to open the mouth wide with the tongue pulled out. A robust, real-time face tracking system was then applied, automatically identifying and mapping a grid of 55 control points on the face, which were tracked during head motion. These points located important features of the face, such as the eyebrows, the nose, the contours of the eyes and mouth, and the external contours, including the chin. Based on this face tracking, the orientation of the head could also be estimated at each frame of the video sequence. Thus, we could infer for each frame the pitch angle of the head pose (related to the vertical rotation of the head) and obtain the degree of head extension. Morphological criteria used in the most frequently cited predictive scores were also extracted, such as mouth opening, degree of visibility of the uvula, and thyromental distance. Discussion and conclusion: Preliminary results suggest that the technique is highly feasible. The next step will be to apply the same automated and objective evaluation to patients undergoing tracheal intubation. The difficulties encountered during intubation will then be correlated with the biometric characteristics of the patients. The ultimate objective is to analyze the biometric data with artificial intelligence algorithms to build a highly sensitive and specific predictive test.
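As a hedged illustration of the head-pose step described above, the sketch below estimates the pitch angle of the head from a handful of tracked 2D landmarks using a generic 3D face model and OpenCV's solvePnP. The six-point model, the camera approximation, and the function head_pitch_deg are illustrative assumptions, not the authors' 55-point tracking system.

```python
# Hypothetical sketch: head pitch (flexion-extension) from tracked
# 2D facial landmarks. Model points and landmark choice are assumptions.
import cv2
import numpy as np

# Generic 3D face model (mm), a common 6-point approximation.
MODEL_3D = np.array([
    (0.0, 0.0, 0.0),          # nose tip
    (0.0, -330.0, -65.0),     # chin
    (-225.0, 170.0, -135.0),  # left eye outer corner
    (225.0, 170.0, -135.0),   # right eye outer corner
    (-150.0, -150.0, -125.0), # left mouth corner
    (150.0, -150.0, -125.0),  # right mouth corner
], dtype=np.float64)

def head_pitch_deg(landmarks_2d, frame_w, frame_h):
    """Return the head pitch angle (degrees) for one video frame.

    landmarks_2d: (6, 2) array of tracked pixel coordinates,
    in the same order as MODEL_3D.
    """
    # Approximate pinhole camera: focal length ~ image width,
    # principal point at the image centre, no lens distortion.
    f = frame_w
    cam = np.array([[f, 0, frame_w / 2],
                    [0, f, frame_h / 2],
                    [0, 0, 1]], dtype=np.float64)
    ok, rvec, _ = cv2.solvePnP(MODEL_3D, landmarks_2d.astype(np.float64),
                               cam, None, flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)
    # Rotation about the camera x-axis = nodding (flexion-extension).
    return np.degrees(np.arctan2(R[2, 1], R[2, 2]))
```

Applying this per frame and taking the difference between the maximal and minimal pitch over the flexion-extension manoeuvre would give the degree of head extension.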

Relevance:

20.00%

Publisher:

Abstract:

Tension-band wiring is a recognised standard treatment for fixation of olecranon fractures. The classical operation technique is well known and widespread among orthopaedic surgeons. Nevertheless, complications such as K-wire migration or skin perforation, together with demanding technical and anatomical prerequisites, call for better-adapted fixation methods. In older female patients, cut-through of the Kirschner wires with concomitant secondary displacement has been observed. We intend to develop a new, better-adapted operation technique for olecranon fractures in elderly patients in order to decrease complications and follow-up procedures. In this study we compare two different K-wire positions: 10 models of the classical AO tension-banding against 10 models with adapted K-wire insertion, in which the K-wire passes from the tip of the olecranon to the posterior cortex of the distal fragment of the ulna. We tested maximal failure load, maximal opening angle, and the maximal work needed to achieve maximal force. For both fixation methods we measured a maximal failure load of more than 600 N (p = 0.94) and a maximal opening angle of about 10° (p = 0.86). To achieve the maximal force, our modified technique required slightly more work (p = 0.16). No statistically significant difference between the two fixation techniques was found. This leads to the conclusion that the modified version is comparable to the classical operation technique in terms of stability, while the adapted K-wire angle in the modified procedure can be expected to cause fewer lesions of neurovascular structures on the volar side. Cadaver studies are needed to support these findings.
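The group comparison reported above can be illustrated with a two-sample t-test. This is a minimal sketch on fabricated data: the loads below are random placeholders consistent with the reported summary (n = 10 per group, failure loads above 600 N), not the study's measurements.

```python
# Illustrative two-sample t-test on maximal failure load for the two
# K-wire configurations. All numbers are fabricated placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
classical = rng.normal(650.0, 60.0, size=10)  # classical AO tension-banding (N)
modified = rng.normal(648.0, 55.0, size=10)   # adapted K-wire insertion (N)

t, p = stats.ttest_ind(classical, modified)
print(f"max failure load: t = {t:.2f}, p = {p:.2f}")
# A p-value near 0.94, as in the study, indicates no detectable
# difference in failure load between the two fixation techniques.
```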

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: This review describes and evaluates the results of laparoscopic aortic surgery. METHODS: We describe the different laparoscopic techniques used to treat aortic disease, including (1) total laparoscopic aortic surgery (TLS), (2) laparoscopy-assisted procedures including hand-assisted laparoscopic surgery (HALS), and (3) robot-assisted laparoscopic surgery, with their current indications. Results of these techniques are analyzed in a systematic review of the clinical series published between 1998 and 2008, each containing >10 patients with complete information concerning operative time, clamping time, conversion rate, length of hospital stay, morbidity, and mortality. RESULTS: We selected and reviewed 29 studies that included 1073 patients. Heterogeneity of the studies and selection of the patients made comparison with current open or endovascular surgery difficult. Median operative time varied widely in TLS, from 240 to 391 minutes. HALS had the shortest operating time. Median clamping time varied from 60 to 146 minutes in TLS and was shorter in HALS. Median hospital stay varied from 4 to 10 days regardless of the laparoscopic technique. The postoperative mortality rate was 2.1% (95% confidence interval, 1.4-3.0), with no significant difference between patients treated for occlusive disease or for aneurysmal disease. Conversion to open surgery was necessary in 8.1% of patients and was slightly higher with TLS than with laparoscopy-assisted techniques (P = .07). CONCLUSIONS: Analysis of these series shows that laparoscopic aortic surgery can be performed safely provided that patient selection is adjusted to the surgeon's experience and conversion is liberally performed. The future of this technique in comparison with endovascular surgery is still unknown, and it is now time for multicenter randomized trials to demonstrate the potential benefit of this type of surgery.
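As a rough, hedged check of the pooled mortality figure, a binomial confidence interval can be computed from the reported rate. The death count below is inferred from 2.1% of 1073 patients (about 23), an assumption; the review may have pooled the studies differently, so the interval need not match exactly.

```python
# Back-of-the-envelope Wilson confidence interval for the pooled
# postoperative mortality. The death count (23) is an inferred
# assumption, not a figure reported in the review.
from statsmodels.stats.proportion import proportion_confint

deaths, patients = 23, 1073
rate = deaths / patients
lo, hi = proportion_confint(deaths, patients, alpha=0.05, method="wilson")
print(f"mortality {rate:.1%}, 95% CI {lo:.1%}-{hi:.1%}")
```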

Relevance:

20.00%

Publisher:

Abstract:

In groundwater applications, Monte Carlo methods are employed to model the uncertainty in geological parameters. However, their brute-force application becomes computationally prohibitive for highly detailed geological descriptions, complex physical processes, and large numbers of realizations. The Distance Kernel Method (DKM) overcomes this issue by clustering the realizations in a multidimensional space based on the flow responses obtained by means of an approximate (computationally cheaper) model; the uncertainty is then estimated from the exact responses, which are computed only for one representative realization per cluster (the medoid). Usually, DKM is employed to decrease the size of the sample of realizations considered in the uncertainty estimate. We propose to also use the information from the approximate responses for uncertainty quantification: the subset of exact solutions provided by DKM is employed to construct an error model and correct the potential bias of the approximate model. Two error models are devised. Both employ the difference between the approximate and exact medoid solutions, but they differ in the way medoid errors are interpolated to correct the whole set of realizations. The Local Error Model rests upon the clustering defined by DKM and can be seen as a natural way to account for intra-cluster variability; the Global Error Model employs a linear interpolation of all medoid errors regardless of the cluster to which a given realization belongs. These error models are evaluated on an idealized pollution problem in which the uncertainty of the breakthrough curve needs to be estimated. For this numerical test case, we demonstrate that the error models improve the uncertainty quantification provided by the DKM algorithm and are effective in correcting the bias of the estimate computed solely from the approximate multiscale finite-volume (MsFV) results. The framework presented here is not specific to the methods considered and can be applied to other combinations of approximate models and techniques for selecting a subset of realizations.
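A minimal sketch of the two error models follows, under several simplifying assumptions: realizations are clustered directly in response space with k-means (standing in for DKM's distance-kernel construction), the exact solver is a placeholder callback, and the Global Error Model is approximated by inverse-distance weighting of medoid errors rather than the specific linear interpolation used in the paper.

```python
# Sketch of medoid-based Local and Global error models, under the
# assumptions stated above; not the authors' actual DKM implementation.
import numpy as np
from sklearn.cluster import KMeans

def dkm_error_models(approx, exact_solver, n_clusters=5, seed=0):
    """approx: (n_real, n_time) approximate breakthrough curves.
    exact_solver(i) -> exact curve for realization i (expensive)."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit(approx)
    corrected_local = approx.copy()
    medoid_idx, medoid_err = [], []
    for c in range(n_clusters):
        members = np.where(km.labels_ == c)[0]
        # Medoid: member realization closest to the cluster centroid.
        d = np.linalg.norm(approx[members] - km.cluster_centers_[c], axis=1)
        m = members[np.argmin(d)]
        err = exact_solver(m) - approx[m]   # exact run only at the medoid
        corrected_local[members] += err     # Local Error Model
        medoid_idx.append(m)
        medoid_err.append(err)
    medoid_err = np.array(medoid_err)
    # Global Error Model (approximated here): weight every medoid error
    # by inverse distance in response space, ignoring cluster membership.
    dist = np.linalg.norm(approx[:, None, :] - approx[medoid_idx][None, :, :],
                          axis=2)
    w = 1.0 / np.maximum(dist, 1e-12)
    w /= w.sum(axis=1, keepdims=True)
    corrected_global = approx + w @ medoid_err
    return corrected_local, corrected_global
```

The uncertainty of the breakthrough curve would then be estimated from the corrected ensembles instead of the raw approximate responses.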

Relevance:

20.00%

Publisher:

Abstract:

The proportion of the population living in or around cities is higher than ever. Urban sprawl and car dependence have taken over from the pedestrian-friendly compact city. Environmental problems such as air pollution, wasted land, and noise, as well as health problems, are the result of this still-continuing process. Urban planners have to find solutions to these complex problems while at the same time ensuring the economic performance of the city and its surroundings. Meanwhile, an ever-increasing quantity of socio-economic and environmental data is being acquired. In order to better understand the processes and phenomena taking place in the complex urban environment, these data should be analysed. Numerous methods for modelling and simulating such a system exist, are still under development, and can be exploited by urban geographers to improve our understanding of the urban metabolism. Modern and innovative visualisation techniques help in communicating the results of such models and simulations. This thesis covers several methods for the analysis, modelling, simulation, and visualisation of problems related to urban geography. The analysis of high-dimensional socio-economic data using artificial neural network techniques, especially self-organising maps, is shown using two examples at different scales. The problem of spatio-temporal modelling and data representation is treated and some possible solutions are shown. The simulation of urban dynamics, and more specifically of the traffic due to commuting to work, is illustrated using multi-agent micro-simulation techniques. A section on visualisation methods presents cartograms for transforming the geographic space into a feature space, and the distance circle map, a centre-based map representation particularly useful for urban agglomerations. Some issues concerning the importance of scale in urban analysis and the clustering of urban phenomena are discussed. A new approach to defining urban areas at different scales is developed, and the link with percolation theory is established. Fractal statistics, especially the lacunarity measure, and scaling laws are used to characterise urban clusters. In a last section, population evolution is modelled using a model close to the well-established gravity model. The work covers quite a wide range of methods useful in urban geography. These methods should be developed further and, at the same time, find their way into the daily work and decision-making processes of urban planners.
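As a hedged illustration of the percolation-style definition of urban clusters mentioned above: cells of a density grid above a threshold are joined into connected clusters, and sweeping the threshold changes the scale at which urban areas emerge. The grid below is smoothed random noise standing in for real census data; all parameters are arbitrary.

```python
# Percolation-style urban clusters: threshold a density grid and label
# connected components. The density field is a synthetic placeholder.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)
density = ndimage.gaussian_filter(rng.random((200, 200)), sigma=3)

for threshold in (0.45, 0.50, 0.55):
    occupied = density > threshold
    labels, n_clusters = ndimage.label(occupied)  # 4-connectivity
    sizes = np.bincount(labels.ravel())[1:]       # cells per cluster
    print(f"threshold {threshold}: {n_clusters} clusters, "
          f"largest = {sizes.max()} cells")
```

Near the percolation threshold a single giant cluster appears; the distribution of cluster sizes at each threshold is what fractal statistics such as lacunarity can then characterise.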

Relevance:

20.00%

Publisher:

Abstract:

A recurring task in the analysis of mass genome annotation data from high-throughput technologies is the identification of peaks or clusters in a noisy signal profile. Examples of such applications are the definition of promoters on the basis of transcription start site profiles, the mapping of transcription factor binding sites based on ChIP-chip data, and the identification of quantitative trait loci (QTL) from whole-genome SNP profiles. Input to such an analysis is a set of genome coordinates associated with counts or intensities. The output consists of a discrete number of peaks with respective volumes, extensions, and center positions. We have developed for this purpose a flexible one-dimensional clustering tool, called MADAP, which we make available as a web server and as a standalone program. A set of parameters enables the user to customize the procedure to a specific problem. The web server, which returns results in textual and graphical form, is useful for small- to medium-scale applications, as well as for evaluation and parameter tuning in view of large-scale applications, which require a local installation. The program, written in C++, can be freely downloaded from ftp://ftp.epd.unil.ch/pub/software/unix/madap. The MADAP web server can be accessed at http://www.isrec.isb-sib.ch/madap/.
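The sketch below is a minimal stand-in for this kind of one-dimensional clustering, not MADAP's actual algorithm: a count profile over genome coordinates is smoothed, and each peak's center, extension, and volume are reported. Function names and cut-offs are assumptions.

```python
# Toy 1D peak caller over genome coordinates with associated counts;
# illustrates the input/output contract described above, not MADAP itself.
import numpy as np
from scipy.ndimage import gaussian_filter1d
from scipy.signal import find_peaks, peak_widths

def call_peaks(positions, counts, length, sigma=5.0, min_height=2.0):
    """positions/counts: genome coordinates with associated tag counts."""
    profile = np.zeros(length)
    np.add.at(profile, positions, counts)   # raw coverage profile
    smooth = gaussian_filter1d(profile, sigma)
    peaks, _ = find_peaks(smooth, height=min_height)
    widths, _, left, right = peak_widths(smooth, peaks, rel_height=0.5)
    for p, l, r in zip(peaks, left, right):
        volume = profile[int(l):int(r) + 1].sum()
        print(f"peak center={p} extension={int(r - l)} volume={volume:.0f}")
```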

Relevance:

20.00%

Publisher:

Abstract:

The laparoscopic approach has emerged as a valid option for surgical management of kidney cancer, as well as a few benign pathologies. The immediate benefits of laparoscopy are well established and include less estimated blood loss, decreased pain, shorter perioperative convalescence, and improved cosmesis. Long-term oncologic outcomes of patients treated laparoscopically for kidney tumors are similar to those of open surgery.

Relevance:

20.00%

Publisher:

Abstract:

Principles: The surgeon's experience is crucial for proper application of sentinel node biopsy (SNB) in patients with breast cancer. A learning curve of 20-30 cases of sentinel node (SN) biopsy with immediate axillary lymph node dissection (ALND) has been widely practiced. In order to shorten this learning curve, surgeons may be trained intraoperatively by an experienced surgeon. The purpose of this report is to evaluate the results of this procedure. Methods: Patients with one primary invasive breast cancer (cT1-T2[<3 cm]cN0) underwent SNB based on lymphoscintigraphy using technetium Tc 99m colloid and intraoperative gamma probe detection, with or without blue dye mapping. This was followed by completion ALND when the SN was positive or not found. SNB was performed either by one experienced surgeon (teacher) or by 10 junior surgeons trained by the experienced surgeon (trainees). Four groups were defined: (i) SNB with immediate ALND for the teacher's learning curve, (ii) SNB by the teacher, (iii) SNB by the trainees under the teacher's supervision, and (iv) SNB by the trainees alone. Results: Between May 1999 and December 2007, a total of 808 evaluable patients underwent SNB. The SN identification rate was 98% in the teacher's group and 99% in the trainees' group (p = 0.196). SNs were positive in 28% and 29% of patients, respectively (p = 0.196). The distribution of isolated tumor cells, micrometastases, and metastases was not statistically different between the teacher's and the trainees' groups (p = 0.163). Conclusion: These comparable results confirm the success with which SNB was taught. This strategy avoided the 20-30 SNBs followed by immediate ALND previously required of each surgeon.
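The comparison of identification rates between the two groups can be illustrated with Fisher's exact test. The group sizes below are fabricated (only the 808-patient total and the 98% vs. 99% rates are reported above), so the resulting p-value will not reproduce the paper's figure.

```python
# Illustrative 2x2 comparison of SN identification rates; the
# identified/missed counts per group are assumed, not reported.
from scipy.stats import fisher_exact

teacher = [294, 6]    # assumed: 300 patients, 98% identification
trainees = [503, 5]   # assumed: 508 patients, 99% identification
odds, p = fisher_exact([teacher, trainees])
print(f"odds ratio = {odds:.2f}, p = {p:.3f}")
```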

Relevance:

20.00%

Publisher:

Abstract:

PURPOSE: To objectively characterize different heart tissues from the functional and viability images provided by composite-strain-encoding (C-SENC) MRI. MATERIALS AND METHODS: C-SENC is a new MRI technique for simultaneously acquiring cardiac functional and viability images. In this work, an unsupervised multi-stage fuzzy clustering method is proposed to identify different heart tissues in the C-SENC images. The method is based on the sequential application of the fuzzy c-means (FCM) and iterative self-organizing data (ISODATA) clustering algorithms. The proposed method is tested on simulated heart images and on images from nine patients with and without myocardial infarction (MI). The resulting clustered images are compared with MRI delayed-enhancement (DE) viability images for determining MI, and Bland-Altman analysis is conducted between the two methods. RESULTS: Normal myocardium, infarcted myocardium, and blood are correctly identified using the proposed method. The clustered images correctly identified 90 +/- 4% of the pixels defined as infarct in the DE images. In addition, 89 +/- 5% of the pixels defined as infarct in the clustered images were also defined as infarct in the DE images. The Bland-Altman results show no bias between the two methods in identifying MI. CONCLUSION: The proposed technique allows for objectively identifying different heart tissues, which would be potentially important for clinical decision-making in patients with MI.
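Below is a minimal from-scratch sketch of the first stage of such a pipeline, plain fuzzy c-means; the ISODATA stage that the paper chains after FCM is omitted. Feature vectors would be per-pixel intensities from the C-SENC functional and viability channels; all parameters are illustrative.

```python
# Plain fuzzy c-means; a sketch of the FCM stage only, not the
# authors' full multi-stage method.
import numpy as np

def fuzzy_cmeans(X, c=3, m=2.0, n_iter=100, seed=0):
    """X: (n_pixels, n_features). Returns (centers, memberships)."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(X), c))
    u /= u.sum(axis=1, keepdims=True)   # fuzzy memberships sum to 1
    for _ in range(n_iter):
        um = u ** m
        centers = (um.T @ X) / um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.maximum(d, 1e-12)
        # u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1))
        u = 1.0 / ((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1))).sum(axis=2)
    return centers, u
```

Hard tissue labels (e.g. normal myocardium, infarct, blood for c = 3) can then be read off as u.argmax(axis=1).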

Relevance:

20.00%

Publisher:

Abstract:

MOTIVATION: Analysis of millions of pyro-sequences is currently playing a crucial role in the advance of environmental microbiology. Taxonomy-independent, i.e. unsupervised, clustering of these sequences is essential for the definition of Operational Taxonomic Units. For this application, reproducibility and robustness should be the most sought after qualities, but have thus far largely been overlooked. RESULTS: More than 1 million hyper-variable internal transcribed spacer 1 (ITS1) sequences of fungal origin have been analyzed. The ITS1 sequences were first properly extracted from 454 reads using generalized profiles. Then, otupipe, cd-hit-454, ESPRIT-Tree and DBC454, a new algorithm presented here, were used to analyze the sequences. A numerical assay was developed to measure the reproducibility and robustness of these algorithms. DBC454 was the most robust, closely followed by ESPRIT-Tree. DBC454 features density-based hierarchical clustering, which complements the other methods by providing insights into the structure of the data. AVAILABILITY: An executable is freely available for non-commercial users at ftp://ftp.vital-it.ch/tools/dbc454. It is designed to run under MPI on a cluster of 64-bit Linux machines running Red Hat 4.x, or on a multi-core OSX system. CONTACT: dbc454@vital-it.ch or nicolas.guex@isb-sib.ch.
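A hedged sketch of the reproducibility idea behind such a numerical assay: cluster two overlapping subsamples and score their agreement on the shared items with the adjusted Rand index. DBSCAN on a precomputed distance matrix stands in for DBC454's density-based hierarchical clustering, and the distances are random placeholders for real pairwise sequence distances.

```python
# Reproducibility assay sketch: cluster two overlapping subsamples and
# compare labels on the shared items. Data and parameters are synthetic.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(2)
n = 300
pts = rng.random((n, 2))                     # surrogate "sequence space"
dist = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)

shared = rng.choice(n, size=150, replace=False)
labels = []
for _ in range(2):
    extra = rng.choice(n, size=100, replace=False)
    idx = np.unique(np.concatenate([shared, extra]))
    sub = dist[np.ix_(idx, idx)]             # precomputed distances
    lab = DBSCAN(eps=0.1, min_samples=5, metric="precomputed").fit(sub).labels_
    pos = {v: i for i, v in enumerate(idx)}
    labels.append([lab[pos[s]] for s in shared])

print("ARI on shared sequences:", adjusted_rand_score(labels[0], labels[1]))
```

An ARI near 1 across many such subsampling rounds would indicate the robust, reproducible behaviour the abstract highlights for DBC454 and ESPRIT-Tree.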