50 results for Viterbi-based algorithm

at Université de Lausanne, Switzerland


Relevance:

100.00%

Publisher:

Abstract:

Retinoblastoma (Rb) is a tumor arising from the retinal progenitor cells of the photoreceptors. It is the most frequent malignant pediatric tumor, with a birth incidence estimated between 1/15,000 and 1/20,000. The large majority of children with Rb are diagnosed before the age of 4 years, the time required for photoreceptor differentiation and maturation and hence for the disappearance of the Rb cell of origin. Patient survival, ocular salvage, and visual prognosis remain excellent provided that treatment is not delayed. In its non-hereditary form (60%), Rb is always unilateral and sporadic. Hereditary Rb, with autosomal dominant transmission (40%), occurs in all forms, familial (10%) or sporadic (30%), whether the disease is unilateral or bilateral. Most causal mutations are unique and distributed randomly over the entire RB1 gene, with no predisposing region. Detecting these mutations is costly and time-consuming, with a relatively low detection rate, especially in sporadic unilateral Rb. In order to identify patients with a true risk of developing Rb, and to reduce the number of examinations under anesthesia required to screen at-risk subjects, we developed a sensitive, fast, efficient, and inexpensive strategy based on intragenic haplotype analysis. This algorithm takes into account (a) intratumoral loss of heterozygosity of the RB1 gene, (b) the preferential paternal origin of new germline mutations, and (c) an a priori risk derived from Vogel's empirical data. Over the period from January 1994 to December 2006, we compared the occurrence of new Rb cases among the siblings and offspring of affected patients with the number of new cases expected by our algorithm. 134 families were studied. Molecular analysis was performed in 570 individuals, including 99 patients under 4 years of age and therefore at risk of developing Rb. In this cohort, we observed one new case of Rb, whereas the cumulated a posteriori risk computed by our algorithm predicted 1.77 new cases. In this study, we validated our algorithm predicting Rb recurrence in first-degree relatives of affected patients. This tool should greatly facilitate genetic counseling and the follow-up of patients at risk of developing Rb, especially when direct sequencing of the RB1 gene is unavailable or remains uninformative.

Relevance:

100.00%

Publisher:

Abstract:

PURPOSE: Most RB1 mutations are unique and distributed throughout the RB1 gene. Their detection can be time-consuming and the yield especially low in cases of conservatively treated sporadic unilateral retinoblastoma (Rb) patients. In order to identify patients with a true risk of developing Rb, and to reduce the number of unnecessary examinations under anesthesia in all other cases, we developed a universal, sensitive, efficient and cost-effective strategy based on intragenic haplotype analysis. METHODS: This algorithm allows the calculation of the a posteriori risk of developing Rb and takes into account (a) RB1 loss of heterozygosity in tumors, (b) the preferential paternal origin of new germline mutations, (c) the a priori risk derived from empirical data by Vogel, and (d) a disease penetrance of 90% in most cases. We report the occurrence of Rb in first-degree relatives of patients with sporadic Rb who visited the Jules Gonin Eye Hospital, Lausanne, Switzerland, from January 1994 to December 2006, compared to the number of new cases of Rb expected using our algorithm. RESULTS: A total of 134 families with sporadic Rb were enrolled; testing was performed in 570 individuals, and 99 patients younger than 4 years old were identified. We observed one new case of Rb, whereas the cumulated total a posteriori risk of recurrence computed by our algorithm was 1.77. CONCLUSIONS: This is the first time that linkage analysis has been validated to monitor the risk of recurrence in sporadic Rb. This should be a useful tool in genetic counseling, especially when direct screening for RB1 mutations yields a negative result or is unavailable.
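
A minimal sketch of how such a haplotype-based a posteriori risk could be computed is given below, assuming a sibling of a sporadic proband whose shared intragenic RB1 haplotype has been determined. The numeric parameters (a priori carrier probability, paternal-origin fraction, penetrance) are illustrative placeholders, and the formula is a generic Bayesian reconstruction rather than the exact algorithm of the study.

```python
# Hypothetical sketch of a haplotype-based a posteriori risk estimate for the
# sibling of a sporadic Rb proband. All numeric values are placeholders;
# recombination within RB1 and mosaicism are ignored for simplicity.

def sibling_risk(prior_carrier,        # a priori probability that the proband carries a germline RB1 mutation
                 shared_haplotype,     # intragenic RB1 haplotype shared with the proband: "paternal", "maternal", "both" or "none"
                 paternal_origin=0.85, # assumed fraction of new germline mutations arising on the paternal allele
                 penetrance=0.90):     # assumed disease penetrance
    """Posterior probability that the sibling develops Rb, given observed haplotype sharing."""
    if shared_haplotype == "both":
        p_shares_mutant_allele = 1.0
    elif shared_haplotype == "paternal":
        p_shares_mutant_allele = paternal_origin
    elif shared_haplotype == "maternal":
        p_shares_mutant_allele = 1.0 - paternal_origin
    else:  # shares neither haplotype
        p_shares_mutant_allele = 0.0
    return prior_carrier * p_shares_mutant_allele * penetrance

# Example: sibling sharing only the maternal haplotype of a proband with a
# 15% a priori germline-mutation probability.
print(f"{sibling_risk(0.15, 'maternal'):.3f}")
```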

Relevance:

100.00%

Publisher:

Abstract:

Context: Ovarian tumor (OT) typing is a competency expected of pathologists, with significant clinical implications. OT, however, come in numerous different types, some rather rare, so that some departments have few opportunities to practice their diagnosis. Aim: Our aim was to design a tool for pathologists to train in less common OT typing. Method and Results: Representative slides of 20 less common OT were scanned (Nano Zoomer Digital, Hamamatsu®) and the diagnostic algorithm proposed by Young and Scully was applied to each case (Young RH and Scully RE, Seminars in Diagnostic Pathology 2001, 18: 161-235), covering: recognition of morphological pattern(s); shortlisting of the differential diagnosis; and proposal of relevant immunohistochemical markers. The next steps of this project will be: evaluation of the tool in several post-graduate training centers in Europe and Québec; improvement of its design based on the evaluation results; and dissemination to a wider audience. Discussion: In clinical medicine, solving many cases is recognized as being of utmost importance for a novice to become an expert. This project relies on virtual slide technology to provide pathologists with a learning tool aimed at increasing their skills in OT typing. After due evaluation, this model might be extended to other uncommon tumors.
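
The three-step workflow of the algorithm (morphological pattern, differential diagnosis, immunohistochemical panel) could be represented in a training tool roughly as follows; the knowledge-base entries are purely hypothetical placeholders and do not reproduce the tables of Young and Scully.

```python
# Schematic sketch of the pattern -> differential -> IHC-panel lookup.
# All entries are placeholders for illustration only.

from dataclasses import dataclass

@dataclass
class DiagnosticStep:
    patterns: list          # morphological pattern(s) recognised on the virtual slide
    differentials: list     # shortlisted tumour types compatible with the patterns
    ihc_panel: list         # immunohistochemical markers proposed to separate them

# Placeholder knowledge base keyed by dominant morphological pattern.
KNOWLEDGE_BASE = {
    "pattern_A": DiagnosticStep(["pattern_A"], ["tumour_type_1", "tumour_type_2"], ["marker_X", "marker_Y"]),
    "pattern_B": DiagnosticStep(["pattern_B"], ["tumour_type_3"], ["marker_Z"]),
}

def propose_workup(observed_patterns):
    """Merge the differential diagnoses and IHC panels of all recognised patterns."""
    differentials, panel = [], []
    for p in observed_patterns:
        step = KNOWLEDGE_BASE.get(p)
        if step:
            differentials += [d for d in step.differentials if d not in differentials]
            panel += [m for m in step.ihc_panel if m not in panel]
    return DiagnosticStep(observed_patterns, differentials, panel)

print(propose_workup(["pattern_A", "pattern_B"]))
```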

Relevance:

90.00%

Publisher:

Abstract:

PURPOSE: To assess how different diagnostic decision aids perform in terms of sensitivity, specificity, and harm. METHODS: Four diagnostic decision aids were compared, as applied to a simulated patient population: a findings-based algorithm following a linear or branched pathway, a serial threshold-based strategy, and a parallel threshold-based strategy. Headache in immune-compromised HIV patients in a developing country was used as an example. Diagnoses included cryptococcal meningitis, cerebral toxoplasmosis, tuberculous meningitis, bacterial meningitis, and malaria. Data were derived from literature and expert opinion. Diagnostic strategies' validity was assessed in terms of sensitivity, specificity, and harm related to mortality and morbidity. Sensitivity analyses and Monte Carlo simulation were performed. RESULTS: The parallel threshold-based approach led to a sensitivity of 92% and a specificity of 65%. Sensitivities of the serial threshold-based approach and the branched and linear algorithms were 47%, 47%, and 74%, respectively, and the specificities were 85%, 95%, and 96%. The parallel threshold-based approach resulted in the least harm, with the serial threshold-based approach, the branched algorithm, and the linear algorithm being associated with 1.56-, 1.44-, and 1.17-times higher harm, respectively. Findings were corroborated by sensitivity and Monte Carlo analyses. CONCLUSION: A threshold-based diagnostic approach is designed to find the optimal trade-off that minimizes expected harm, enhancing sensitivity and lowering specificity when appropriate, as in the given example of a symptom pointing to several life-threatening diseases. Findings-based algorithms, in contrast, solely consider clinical observations. A parallel workup, as opposed to a serial workup, additionally allows for all potential diseases to be reviewed, further reducing false negatives. The parallel threshold-based approach might, however, not be as good in other disease settings.
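
The core threshold idea can be illustrated with a toy expected-harm calculation: for a given pre-test probability of disease, the strategy (no treatment, test then treat, or empirical treatment) with the lowest expected harm is chosen. The harm values and test characteristics below are arbitrary placeholders, not the parameters of the simulated headache example.

```python
# Toy illustration of a threshold-based decision: pick the action with the
# lowest expected harm for a given pre-test disease probability p.
# All numbers are illustrative placeholders.

def expected_harms(p, sens=0.9, spec=0.8,
                   harm_missed=10.0,     # harm of leaving the disease untreated
                   harm_overtreat=1.0,   # harm of treating a patient without the disease
                   harm_test=0.1):       # harm/burden of the test itself
    no_treat = p * harm_missed
    treat_all = (1 - p) * harm_overtreat
    # Test then treat: false negatives go untreated, false positives are overtreated.
    test = harm_test + p * (1 - sens) * harm_missed + (1 - p) * (1 - spec) * harm_overtreat
    return {"no treatment": no_treat, "test then treat": test, "empirical treatment": treat_all}

for p in (0.01, 0.05, 0.2, 0.6):
    harms = expected_harms(p)
    best = min(harms, key=harms.get)
    print(f"p={p:.2f}  best strategy: {best:20s}  expected harm: {harms[best]:.2f}")
```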

Relevance:

90.00%

Publisher:

Abstract:

This study looks at how increased memory utilisation affects throughput and energy consumption in scientific computing, especially in high-energy physics. Our aim is to minimise the energy consumed by a set of jobs without increasing the processing time. Earlier tests indicated that, especially in data analysis, throughput can increase by over 100% and energy consumption can decrease by 50% when multiple jobs are processed in parallel per CPU core. Since jobs are heterogeneous, it is not possible to find an optimum value for the number of parallel jobs. A better solution is based on memory utilisation, but finding an optimum memory threshold is not straightforward. Therefore, a fuzzy logic-based algorithm was developed that can dynamically adapt the memory threshold based on the overall load. In this way, it is possible to keep memory consumption stable under different workloads while achieving significantly higher throughput and energy efficiency than traditional approaches that use a fixed number of jobs or a fixed memory threshold.
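
A minimal sketch of such a fuzzy controller is shown below; it adapts the memory threshold used to admit additional parallel jobs according to the current memory load. The membership functions and rule outputs are illustrative assumptions, not the tuning used in the study.

```python
# Minimal fuzzy-logic sketch: adapt the per-node memory threshold used to
# admit new parallel jobs according to the overall memory load.
# Membership breakpoints and rule outputs are placeholders.

def triangular(x, a, b, c):
    """Triangular membership function with peak at b and support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def adapt_threshold(current_threshold_gb, memory_load):
    """memory_load in [0, 1]; returns an adjusted memory threshold in GB."""
    # Fuzzify the load into three overlapping sets.
    low    = triangular(memory_load, -0.4, 0.0, 0.6)
    medium = triangular(memory_load,  0.3, 0.6, 0.85)
    high   = triangular(memory_load,  0.7, 1.0, 1.4)
    # Rule base: low load -> raise the threshold, medium -> keep it, high -> lower it.
    deltas = {+2.0: low, 0.0: medium, -2.0: high}
    total = sum(deltas.values())
    if total == 0.0:
        return current_threshold_gb
    # Weighted-average defuzzification of the adjustment.
    adjustment = sum(delta * weight for delta, weight in deltas.items()) / total
    return max(1.0, current_threshold_gb + adjustment)

for load in (0.2, 0.6, 0.95):
    print(load, round(adapt_threshold(16.0, load), 2))
```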

Relevance:

80.00%

Publisher:

Abstract:

The analysis of rockfall characteristics and spatial distribution is fundamental to understanding and modelling the main factors that predispose to failure. In our study we analysed LiDAR point clouds with two aims: (1) to detect and characterise single rockfalls; and (2) to investigate their spatial distribution. To this end, different clustering algorithms were applied: (1a) Nearest Neighbour Clutter Removal (NNCR) in combination with Expectation-Maximization (EM) in order to separate feature points from clutter; (1b) a density-based algorithm (DBSCAN) to isolate the single clusters (i.e. the rockfall events); and (2) finally, Ripley's K-function to investigate the global spatial pattern of the extracted rockfalls. The method allowed proper identification and characterization of more than 600 rockfalls that occurred on a cliff located in Puigcercos (Catalonia, Spain) during a time span of six months. The spatial distribution of these events showed that rockfalls were clustered within a well-defined distance range. Computations were carried out using the free R software for statistical computing and graphics. Understanding the spatial distribution of precursory rockfalls may shed light on the forecasting of future failures.
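
As an illustration of step (1b), the sketch below groups candidate rockfall points into individual events with DBSCAN (here via scikit-learn in Python rather than R); the eps and min_samples values are placeholders that in practice depend on point density and cliff geometry.

```python
# Sketch of the density-based clustering step: group feature points into
# individual rockfall events; points labelled -1 are residual noise.

import numpy as np
from sklearn.cluster import DBSCAN

def extract_rockfall_events(points_xyz, eps=0.15, min_samples=10):
    """points_xyz: (N, 3) array of LiDAR difference points flagged as 'feature'."""
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points_xyz)
    return {k: points_xyz[labels == k] for k in set(labels) if k != -1}

# Toy example: two compact clusters plus scattered noise.
rng = np.random.default_rng(0)
cloud = np.vstack([rng.normal([0, 0, 0], 0.05, (50, 3)),
                   rng.normal([2, 1, 0], 0.05, (50, 3)),
                   rng.uniform(-1, 3, (20, 3))])
events = extract_rockfall_events(cloud)
print({k: len(v) for k, v in events.items()})
```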

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: This study describes the prevalence, associated anomalies, and demographic characteristics of cases of multiple congenital anomalies (MCA) in 19 population-based European registries (EUROCAT) covering 959,446 births in 2004 and 2010. METHODS: EUROCAT implemented a computer algorithm for classification of congenital anomaly cases followed by manual review of potential MCA cases by geneticists. MCA cases are defined as cases with two or more major anomalies of different organ systems, excluding sequences, chromosomal and monogenic syndromes. RESULTS: The combination of an epidemiological and clinical approach for classification of cases has improved the quality and accuracy of the MCA data. Total prevalence of MCA cases was 15.8 per 10,000 births. Fetal deaths and termination of pregnancy were significantly more frequent in MCA cases compared with isolated cases (p < 0.001) and MCA cases were more frequently prenatally diagnosed (p < 0.001). Live born infants with MCA were more often born preterm (p < 0.01) and with birth weight < 2500 grams (p < 0.01). Respiratory and ear, face, and neck anomalies were the most likely to occur with other anomalies (34% and 32%) and congenital heart defects and limb anomalies were the least likely to occur with other anomalies (13%) (p < 0.01). However, due to their high prevalence, congenital heart defects were present in half of all MCA cases. Among males with MCA, the frequency of genital anomalies was significantly greater than the frequency of genital anomalies among females with MCA (p < 0.001). CONCLUSION: Although rare, MCA cases are an important public health issue, because of their severity. The EUROCAT database of MCA cases will allow future investigation on the epidemiology of these conditions and related clinical and diagnostic problems.
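
A schematic sketch of the classification rule described above is given below; the field names are hypothetical, and the actual EUROCAT algorithm additionally relies on detailed anomaly code lists and manual review by geneticists.

```python
# Schematic sketch of the MCA case-classification rule: flag a case as
# potential MCA when it has major anomalies in two or more different organ
# systems and is not explained by a sequence, a chromosomal anomaly or a
# monogenic syndrome. Field names are hypothetical placeholders.

from dataclasses import dataclass, field

@dataclass
class Case:
    major_anomalies: list = field(default_factory=list)   # organ system of each major anomaly
    has_chromosomal_anomaly: bool = False
    has_monogenic_syndrome: bool = False
    is_sequence: bool = False

def is_potential_mca(case):
    if case.has_chromosomal_anomaly or case.has_monogenic_syndrome or case.is_sequence:
        return False
    return len(set(case.major_anomalies)) >= 2

print(is_potential_mca(Case(major_anomalies=["cardiac", "limb"])))      # True -> manual review
print(is_potential_mca(Case(major_anomalies=["cardiac", "cardiac"])))   # False: single organ system
```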

Relevance:

30.00%

Publisher:

Abstract:

The goal of the present work was to assess the feasibility of using a pseudo-inverse and null-space optimization approach for modeling shoulder biomechanics. The method was applied to a simplified musculoskeletal shoulder model. The mechanical system consisted of the arm, and the external forces were the arm weight, the forces of 6 scapulo-humeral muscles, and the reaction at the glenohumeral joint, which was modeled as a spherical joint. Muscle wrapping was considered around the humeral head, which was assumed to be spherical. The dynamical equations were solved using a Lagrangian approach. The mathematical redundancy of the mechanical system was resolved in two steps: a pseudo-inverse optimization to minimize the square of the muscle stress, followed by a null-space optimization to restrict the muscle forces to physiological limits. Several movements were simulated. The mathematical and numerical aspects of the constrained redundancy problem were efficiently solved by the proposed method. The predicted muscle moment arms were consistent with cadaveric measurements, and the joint reaction force was consistent with in vivo measurements. This preliminary work demonstrated that the developed algorithm has great potential for more complex musculoskeletal modeling of the shoulder joint. In particular, it could be further applied to a non-spherical joint model, allowing for the natural translation of the humeral head in the glenoid fossa.
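
A small numerical sketch of this two-step resolution of muscle-force redundancy is given below: a pseudo-inverse solution followed by a null-space correction that enforces force bounds without violating moment equilibrium. The moment-arm matrix, the bounds and the use of squared forces as a proxy for squared muscle stress are illustrative assumptions, not the shoulder model of the study.

```python
# Two-step redundancy resolution: pseudo-inverse solution of A f = b, then a
# null-space correction f = f0 + N z (A N = 0) that enforces force bounds.
# The 3x6 moment-arm matrix and the bounds are arbitrary placeholders.

import numpy as np
from scipy.linalg import null_space
from scipy.optimize import minimize

rng = np.random.default_rng(1)
A = rng.normal(size=(3, 6))             # moment arms: 3 equilibrium equations, 6 muscles
f_feasible = rng.uniform(5.0, 20.0, 6)  # some admissible force distribution...
b = A @ f_feasible                      # ...and the net joint moment it balances
f_min, f_max = 0.0, 50.0                # physiological force limits (placeholders)

# Step 1: minimum-norm (pseudo-inverse) solution of A f = b.
f0 = np.linalg.pinv(A) @ b

# Step 2: add a null-space component that stays as close as possible to f0
# while respecting the bounds; equilibrium is preserved because A N = 0.
N = null_space(A)                                   # orthonormal 6 x 3 basis
cons = [{"type": "ineq", "fun": lambda z: f0 + N @ z - f_min},
        {"type": "ineq", "fun": lambda z: f_max - (f0 + N @ z)}]
z = minimize(lambda z: z @ z, np.zeros(N.shape[1]), method="SLSQP", constraints=cons).x
f = f0 + N @ z

print("equilibrium residual:", np.linalg.norm(A @ f - b))
print("force range:", f.min().round(2), "to", f.max().round(2))
```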

Relevance:

30.00%

Publisher:

Abstract:

The implicit projection algorithm of isotropic plasticity is extended to an objective anisotropic elastic perfectly plastic model. The recursion formula developed to project the trial stress onto the yield surface is applicable to any nonlinear elastic law and any plastic yield function. A curvilinear transverse isotropic model based on a quadratic elastic potential and on Hill's quadratic yield criterion is then developed and implemented in a computer program for bone mechanics applications. The paper concludes with a numerical study of a schematic bone-prosthesis system to illustrate the potential of the model.
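
For orientation, the generic closest-point (implicit) projection for an elastic perfectly plastic step can be written in the standard backward-Euler form below; these are textbook relations, not the specific curvilinear transverse isotropic expressions derived in the paper, and with a nonlinear elastic law the trial stress is evaluated from the elastic potential rather than a constant operator C.

```latex
\begin{aligned}
\boldsymbol{\sigma}^{\mathrm{trial}} &= \boldsymbol{\sigma}_n + \mathbf{C} : \Delta\boldsymbol{\varepsilon},\\
f(\boldsymbol{\sigma}^{\mathrm{trial}}) \le 0 &\;\Longrightarrow\; \boldsymbol{\sigma}_{n+1} = \boldsymbol{\sigma}^{\mathrm{trial}} \quad\text{(elastic step)},\\
f(\boldsymbol{\sigma}^{\mathrm{trial}}) > 0 &\;\Longrightarrow\; \boldsymbol{\sigma}_{n+1} = \boldsymbol{\sigma}^{\mathrm{trial}} - \Delta\gamma\,\mathbf{C} : \left.\frac{\partial f}{\partial \boldsymbol{\sigma}}\right|_{\boldsymbol{\sigma}_{n+1}},\qquad f(\boldsymbol{\sigma}_{n+1}) = 0,
\end{aligned}
```

where the plastic multiplier Δγ is obtained from the scalar consistency condition, typically by Newton iteration.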

Relevance:

30.00%

Publisher:

Abstract:

Ultrasound segmentation is a challenging problem due to the inherent speckle and to artifacts such as shadows, attenuation, and signal dropout. Existing methods need to include strong priors, such as shape priors or analytical intensity models, to succeed in the segmentation. However, such priors tend to limit these methods to a specific target or imaging setting, and they are not always applicable to pathological cases. This work introduces a semi-supervised segmentation framework for ultrasound imaging that alleviates this limitation of fully automatic segmentation: it is applicable to any kind of target and imaging setting. Our methodology uses a graph of image patches to represent the ultrasound image, together with user-assisted initialization with labels, which act as soft priors. The segmentation problem is formulated as a continuous minimum cut problem and solved with an efficient optimization algorithm. We validate our segmentation framework on clinical ultrasound imaging (prostate, fetus, and tumors of the liver and eye). We obtain high similarity agreement with the ground truth provided by medical expert delineations in all applications (Dice values of 94% on average), and the proposed algorithm compares favorably with the literature.
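
The patch-graph idea can be illustrated with a toy discrete s-t cut, a simplification of the continuous minimum cut formulation used in the paper; the patch features, edge weights and seed labels below are placeholders.

```python
# Toy sketch: represent the image as a graph of patches, attach user seeds as
# terminal links, and segment by a minimum s-t cut (discrete simplification of
# the continuous formulation; seeds are made hard here for simplicity).

import numpy as np
import networkx as nx

def segment_patches(features, edges, fg_seeds, bg_seeds, sigma=0.1):
    """features: (N,) mean intensity per patch; edges: list of (i, j) patch adjacencies."""
    G = nx.DiGraph()
    for i, j in edges:
        w = float(np.exp(-((features[i] - features[j]) ** 2) / (2 * sigma ** 2)))
        G.add_edge(i, j, capacity=w)      # symmetric n-links between neighbouring patches
        G.add_edge(j, i, capacity=w)
    for i in fg_seeds:
        G.add_edge("S", i, capacity=float("inf"))
    for i in bg_seeds:
        G.add_edge(i, "T", capacity=float("inf"))
    _, (source_side, _) = nx.minimum_cut(G, "S", "T")
    return {i for i in source_side if i != "S"}       # patches labelled as foreground

# 1-D toy "image" of 8 patches: dark object in the middle of a bright background.
features = np.array([0.9, 0.9, 0.2, 0.2, 0.2, 0.9, 0.9, 0.9])
edges = [(i, i + 1) for i in range(7)]
print(segment_patches(features, edges, fg_seeds=[3], bg_seeds=[0, 7]))
```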

Relevance:

30.00%

Publisher:

Abstract:

Methods like Event History Analysis can show the existence of diffusion and part of its nature, but do not study the process itself. Nowadays, thanks to the increasing performance of computers, such processes can be studied using computational modeling. This thesis presents an agent-based model of policy diffusion mainly inspired by the model developed by Braun and Gilardi (2006). I first develop a theoretical framework of policy diffusion that presents the main internal drivers of policy diffusion - such as the preference for the policy, the effectiveness of the policy, the institutional constraints, and ideology - and its main mechanisms, namely learning, competition, emulation, and coercion. Diffusion, expressed through these interdependencies, is therefore a complex process that needs to be studied with computational agent-based modeling. In a second step, computational agent-based modeling is defined along with its most significant concepts: complexity and emergence. Using computational agent-based modeling implies the development of an algorithm and its programming. Once the algorithm has been developed, the different agents are left to interact. A phenomenon of diffusion, derived from learning, then emerges, meaning that the choice made by an agent is conditional on that made by its neighbors. As a result, learning follows an inverted S-curve, which leads to partial convergence - global divergence and local convergence - that triggers the emergence of political clusters, i.e. the creation of regions with the same policy. Furthermore, the average effectiveness in this computational world tends to follow a J-shaped curve, meaning that not only is time needed for a policy to deploy its effects, but it also takes time for a country to find the best-suited policy. To conclude, diffusion is an emergent phenomenon arising from complex interactions, and the outcomes of my model are in line with theoretical expectations and empirical evidence.
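
A minimal sketch of the learning mechanism is given below: agents on a grid adopt a policy with a probability that grows with the share of adopting neighbours, which typically yields an S-shaped adoption curve and local policy clusters. The grid, neighbourhood and adoption parameters are illustrative assumptions, not those of the Braun and Gilardi model.

```python
# Minimal agent-based sketch of diffusion through learning on a grid.
# Parameters are illustrative placeholders.

import numpy as np

rng = np.random.default_rng(42)
n, steps, base_rate, social_weight = 30, 60, 0.002, 0.25
adopted = np.zeros((n, n), dtype=bool)
adopted[n // 2, n // 2] = True                      # a single early adopter

history = []
for _ in range(steps):
    # Share of adopting neighbours (von Neumann neighbourhood, wrap-around edges).
    neighbours = sum(np.roll(adopted, shift, axis) for shift, axis in
                     [(1, 0), (-1, 0), (1, 1), (-1, 1)]) / 4.0
    p_adopt = base_rate + social_weight * neighbours   # learning from neighbours
    adopted |= rng.random((n, n)) < p_adopt
    history.append(adopted.mean())

print([round(h, 2) for h in history[::10]])   # adoption share over time (S-shaped growth)
```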

Relevance:

30.00%

Publisher:

Abstract:

Given the very large amount of data obtained every day through population surveys, much new research could use this existing information instead of collecting new samples. Unfortunately, relevant data are often scattered across different files obtained through different sampling designs. Data fusion is a set of methods used to combine information from different sources into a single dataset. In this article, we are interested in a specific problem: the fusion of two data files, one of which is quite small. We propose a model-based procedure combining a logistic regression with an Expectation-Maximization algorithm. Results show that, despite the lack of data, this procedure can perform better than standard matching procedures.
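
One generic way to combine a logistic regression with an EM algorithm in such a fusion setting is sketched below: a variable observed only in the large file is treated as missing data in the small file, the E-step weights its possible values by their posterior probability, and the M-step refits the weighted logistic regression. The simulated data and model structure are illustrative assumptions, not the exact procedure of the article.

```python
# Generic EM + logistic regression sketch for fusing two files: the large file
# observes (x, y), the small file observes (x, z) but not the binary y, and we
# want a model for z given (x, y). Illustrative reconstruction with placeholders.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_large, n_small = 2000, 150
x_l = rng.normal(size=n_large)
y_l = (rng.random(n_large) < 1 / (1 + np.exp(-0.8 * x_l))).astype(int)
x_s = rng.normal(size=n_small)
y_s = (rng.random(n_small) < 1 / (1 + np.exp(-0.8 * x_s))).astype(int)   # y_s is *not* observed
z_s = (rng.random(n_small) < 1 / (1 + np.exp(-(0.5 * x_s + 1.5 * y_s - 0.5)))).astype(int)

model_y = LogisticRegression()            # P(y | x), identifiable from the large file
model_z = LogisticRegression()            # P(z | x, y), the fusion model of interest
model_y.fit(x_l.reshape(-1, 1), y_l)

# Duplicate each small-file record with y = 0 and y = 1.
Xz = np.column_stack([np.tile(x_s, 2), np.repeat([0, 1], n_small)])
zz = np.tile(z_s, 2)
w = np.full(2 * n_small, 0.5)             # initial E-step weights

for _ in range(30):                        # EM iterations
    model_z.fit(Xz, zz, sample_weight=w)   # M-step: weighted logistic regression
    p_y1 = model_y.predict_proba(x_s.reshape(-1, 1))[:, 1]
    p_z_y0 = model_z.predict_proba(np.column_stack([x_s, np.zeros(n_small)]))
    p_z_y1 = model_z.predict_proba(np.column_stack([x_s, np.ones(n_small)]))
    lik0 = (1 - p_y1) * p_z_y0[np.arange(n_small), z_s]
    lik1 = p_y1 * p_z_y1[np.arange(n_small), z_s]
    post1 = lik1 / (lik0 + lik1)           # E-step: posterior P(y = 1 | x, z)
    w = np.concatenate([1 - post1, post1])

print("estimated coefficients of P(z | x, y):", model_z.coef_.round(2), model_z.intercept_.round(2))
```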

Relevance:

30.00%

Publisher:

Abstract:

Auditory evoked potentials are informative of intact cortical functions of comatose patients. The integrity of auditory functions evaluated using mismatch negativity paradigms has been associated with their chances of survival. However, because auditory discrimination is assessed at various delays after coma onset, it is still unclear whether this impairment depends on the time of the recording. We hypothesized that impairment in auditory discrimination capabilities is indicative of coma progression, rather than of the comatose state itself, and that rudimentary auditory discrimination remains intact during acute stages of coma. We studied 30 post-anoxic comatose patients resuscitated from cardiac arrest and five healthy, age-matched controls. Using a mismatch negativity paradigm, we performed two electroencephalography recordings with a standard 19-channel clinical montage: the first within 24 h after coma onset and under mild therapeutic hypothermia, and the second after 1 day and under normothermic conditions. We analysed electroencephalography responses based on a multivariate decoding algorithm that automatically quantifies neural discrimination at the single-patient level. Results showed high average decoding accuracy in discriminating sounds both for control subjects and comatose patients. Importantly, accurate decoding was largely independent of patients' chance of survival. However, the progression of auditory discrimination between the first and second recordings was informative of a patient's chance of survival: a deterioration of auditory discrimination was observed in all non-survivors (equivalent to a 100% positive predictive value for survivors). We show, for the first time, evidence of intact auditory processing even in comatose patients who do not survive, and that the progression of sound discrimination over time is informative of a patient's chance of survival. Tracking auditory discrimination in comatose patients could provide new insight into the chance of awakening in a quantitative and automatic fashion during early stages of coma.
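
A schematic sketch of single-patient decoding is given below: a cross-validated classifier separates epochs following standard versus deviant sounds, and its cross-validated accuracy serves as the discrimination score. This mirrors the general logic of multivariate decoding only; the classifier, features and simulated data are placeholders rather than the specific algorithm of the study.

```python
# Sketch of single-patient decoding of standard vs deviant EEG epochs with a
# cross-validated classifier; shapes and classifier choice are placeholders.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def discrimination_score(epochs, labels, n_folds=5):
    """epochs: (n_trials, n_channels, n_times) single-patient EEG; labels: 0 standard / 1 deviant."""
    X = epochs.reshape(len(epochs), -1)                 # flatten channels x time into features
    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    return cross_val_score(clf, X, labels, cv=n_folds).mean()

# Toy example with simulated epochs (19 channels, 60 time samples).
rng = np.random.default_rng(0)
labels = np.repeat([0, 1], 60)
epochs = rng.normal(size=(120, 19, 60))
epochs[labels == 1, :, 30:] += 0.4                      # simulated mismatch response on deviants
print(f"decoding accuracy: {discrimination_score(epochs, labels):.2f}")
```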

Relevance:

30.00%

Publisher:

Abstract:

High-resolution tomographic imaging of the shallow subsurface is becoming increasingly important for a wide range of environmental, hydrological and engineering applications. Because of their superior resolution power, their sensitivity to pertinent petrophysical parameters, and their far reaching complementarities, both seismic and georadar crosshole imaging are of particular importance. To date, corresponding approaches have largely relied on asymptotic, ray-based approaches, which only account for a very small part of the observed wavefields, inherently suffer from a limited resolution, and in complex environments may prove to be inadequate. These problems can potentially be alleviated through waveform inversion. We have developed an acoustic waveform inversion approach for crosshole seismic data whose kernel is based on a finite-difference time-domain (FDTD) solution of the 2-D acoustic wave equations. This algorithm is tested on and applied to synthetic data from seismic velocity models of increasing complexity and realism and the results are compared to those obtained using state-of-the-art ray-based traveltime tomography. Regardless of the heterogeneity of the underlying models, the waveform inversion approach has the potential of reliably resolving both the geometry and the acoustic properties of features of the size of less than half a dominant wavelength. Our results do, however, also indicate that, within their inherent resolution limits, ray-based approaches provide an effective and efficient means to obtain satisfactory tomographic reconstructions of the seismic velocity structure in the presence of mild to moderate heterogeneity and in absence of strong scattering. Conversely, the excess effort of waveform inversion provides the greatest benefits for the most heterogeneous, and arguably most realistic, environments where multiple scattering effects tend to be prevalent and ray-based methods lose most of their effectiveness.