920 results for nonparametric inference


Relevance: 10.00%

Abstract:

BACKGROUND: Gastroesophageal reflux and progressive esophageal dilatation can develop after gastric banding (GB). HYPOTHESIS: Gastric banding may interfere with esophageal motility, enhance reflux, or promote esophageal dilatation. DESIGN: Before-after trial in patients undergoing GB. SETTING: University teaching hospital. PATIENTS AND METHODS: Between January 1999 and August 2002, 43 patients undergoing laparoscopic GB for morbid obesity underwent upper gastrointestinal endoscopy, 24-hour pH monitoring, and stationary esophageal manometry before GB and between 6 and 18 months postoperatively. MAIN OUTCOME MEASURES: Reflux symptoms, endoscopic esophagitis, pressures measured at manometry, esophageal acid exposure. RESULTS: There was no difference in the prevalence of reflux symptoms or esophagitis before and after GB. The lower esophageal sphincter was unaffected by surgery, but contractions in the lower esophagus weakened after GB, in correlation with preoperative values. There was a trend toward more postoperative nonspecific motility disorders. Esophageal acid exposure tended to decrease after GB, with fewer reflux episodes. A few patients developed massive postoperative reflux. There was no clear correlation between preoperative testing and postoperative esophageal acid exposure, although patients with abnormal preoperative acid exposure tended to maintain high values after GB. CONCLUSIONS: Postoperative esophageal dysmotility and gastroesophageal reflux are not uncommon after GB. Preoperative testing should be done routinely. Low amplitude of contraction in the lower esophagus and increased esophageal acid exposure should be regarded as contraindications to GB. Patients with such findings should be offered an alternative procedure, such as Roux-en-Y gastric bypass.

Relevance: 10.00%

Abstract:

BACKGROUND: Many factors affect survival in haemodialysis (HD) patients. Our aim was to study whether the quality of clinical care affects survival in this population when adjusted for demographic characteristics and co-morbidities. METHODS: We studied survival in 553 patients treated by chronic HD during March 2001 in 21 dialysis facilities in western Switzerland. Indicators of quality of care were established for anaemia control, calcium-phosphate product, serum albumin, pre-dialysis blood pressure (BP), type of vascular access and dialysis adequacy (spKt/V), and their baseline values were related to 3-year survival. The modified Charlson co-morbidity index (including age) and transplantation status were also considered as predictors of survival. RESULTS: Three-year survival status was obtained for 96% of the patients; 39% (211/541) of these patients had died. The 3-year survival was 50, 62 and 69%, respectively, in patients who had 0-2, 3 and ≥4 fulfilled indicators of quality of care (test for linear trend, P < 0.001). In a Cox multivariate analysis model, the absence of transplantation, a higher modified Charlson score, decreased fulfilment of indicators of good clinical care and low pre-dialysis systolic BP were independent predictors of death. CONCLUSION: Good clinical care improves survival in HD patients, even after adjustment for availability of transplantation and co-morbidities.
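As a rough illustration of the kind of multivariate survival analysis described in this abstract, the sketch below fits a Cox proportional-hazards model relating follow-up time and mortality to a few baseline predictors. The data, column names and effect sizes are entirely hypothetical, chosen only to mirror the study design; it is not the authors' analysis.

```python
# Hedged sketch of a Cox proportional-hazards analysis of 3-year survival.
# Data and column names are hypothetical; requires `pip install lifelines`.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
quality = rng.integers(0, 7, n)          # fulfilled quality-of-care indicators (hypothetical)
charlson = rng.integers(2, 12, n)        # modified Charlson co-morbidity index (incl. age)
transplanted = rng.integers(0, 2, n)     # transplantation during follow-up (yes/no)

# Hypothetical event times: worse scores shorten survival (purely illustrative)
hazard = np.exp(0.15 * charlson - 0.2 * quality - 0.8 * transplanted)
time = np.minimum(rng.exponential(60 / hazard), 36)   # months, censored at 3 years
died = (time < 36).astype(int)

df = pd.DataFrame({"time": time, "died": died, "quality": quality,
                   "charlson": charlson, "transplanted": transplanted})
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="died")
cph.print_summary()   # hazard ratios for each baseline predictor
```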

Relevance: 10.00%

Abstract:

The flourishing number of publications on the use of isotope ratio mass spectrometry (IRMS) in forensic science reflects the enthusiasm and attraction generated by this technology. IRMS has demonstrated its potential to distinguish chemically identical compounds coming from different sources. Despite the numerous applications of IRMS to a wide range of forensic materials, its implementation in a forensic framework is less straightforward than it appears. In addition, each laboratory has developed its own strategy of analysis regarding calibration, sequence design, use of standards and data treatment, without a clear consensus. Drawing on the experience acquired from research undertaken in different forensic fields, we propose a methodological framework for the whole process of using IRMS methods. We emphasize the importance of considering isotopic results as part of a whole approach when applying this technology to a particular forensic issue. The process is divided into six different steps, which should be considered for a thoughtful and relevant application. Dissecting the process into fundamental steps, each further detailed, enables a better understanding of the essential, though not exhaustive, factors that have to be considered in order to obtain results of sufficient quality and robustness for retrospective analyses or interlaboratory comparisons.
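Although the framework above is methodological rather than computational, the data treatment steps it refers to ultimately rest on delta notation and normalization against reference materials. The minimal sketch below illustrates that arithmetic under an assumed two-point normalization scheme; all numeric values are hypothetical and stand in for real standards data.

```python
# Minimal sketch of core IRMS data treatment: delta notation and a two-point
# normalization of raw instrument deltas onto a reference scale.
# All numeric values are hypothetical, not real standards data.

def delta_permil(r_sample: float, r_standard: float) -> float:
    """delta = (R_sample / R_standard - 1) * 1000, expressed in per mil."""
    return (r_sample / r_standard - 1.0) * 1000.0

def two_point_normalization(raw_1, true_1, raw_2, true_2):
    """Linear mapping from raw deltas to the reference scale, anchored on two
    calibration standards measured within the same analytical sequence."""
    slope = (true_2 - true_1) / (raw_2 - raw_1)
    intercept = true_1 - slope * raw_1
    return lambda raw: slope * raw + intercept

# Hypothetical standards: known (true) deltas and deltas measured in this sequence
normalize = two_point_normalization(raw_1=-29.3, true_1=-30.0, raw_2=-11.8, true_2=-12.0)
print(delta_permil(0.0109, 0.0112))   # raw delta of a sample vs. the working standard
print(normalize(-25.1))               # a raw delta mapped onto the reference scale
```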

Relevance: 10.00%

Abstract:

The research considers the problem of spatial data classification using machine learning algorithms: probabilistic neural networks (PNN) and support vector machines (SVM). As a benchmark model, the simple k-nearest neighbor algorithm is considered. The PNN is a neural-network reformulation of the well-known nonparametric principles of probability density modeling using kernel density estimators and Bayesian optimal or maximum a posteriori decision rules. PNNs are well suited to problems where not only predictions but also quantification of accuracy and integration of prior information are necessary. An important property of PNNs is that they can easily be used in decision support systems dealing with problems of automatic classification. The support vector machine is an implementation of the principles of statistical learning theory for classification tasks. Recently, SVMs have been successfully applied to a variety of environmental topics: classification of soil types and hydro-geological units, optimization of monitoring networks, and susceptibility mapping of natural hazards. In the present paper, both simulated and real data case studies (low- and high-dimensional) are considered. The main attention is paid to the detection and learning of spatial patterns by the applied algorithms.
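As a rough sketch of the PNN idea described above (class-conditional kernel density estimation combined with a maximum a posteriori decision rule), the following minimal example classifies 2-D spatial points. It is an illustrative, assumption-laden implementation with toy data, not the paper's code.

```python
# Minimal PNN-style classifier: Gaussian kernel density estimate per class
# plus a maximum a posteriori decision rule. Illustrative only.
import numpy as np

def pnn_predict(X_train, y_train, X_test, sigma=0.5):
    """Return MAP class labels and class posterior probabilities for X_test."""
    classes = np.unique(y_train)
    posteriors = np.zeros((len(X_test), len(classes)))
    for j, c in enumerate(classes):
        Xc = X_train[y_train == c]
        prior = len(Xc) / len(X_train)
        # Pairwise squared distances between test points and class-c training points
        d2 = ((X_test[:, None, :] - Xc[None, :, :]) ** 2).sum(-1)
        # Parzen-window estimate of the class-conditional density (up to a constant)
        likelihood = np.exp(-d2 / (2.0 * sigma**2)).mean(axis=1)
        posteriors[:, j] = prior * likelihood
    posteriors /= posteriors.sum(axis=1, keepdims=True)
    return classes[np.argmax(posteriors, axis=1)], posteriors

# Toy spatial data: two clusters of 2-D coordinates
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
labels, probs = pnn_predict(X, y, np.array([[0.2, 0.1], [2.8, 3.1]]))
print(labels, probs.round(3))
```

The posterior probabilities returned alongside the labels are what makes this kind of model attractive for decision support, as the abstract notes: the prediction comes with an explicit quantification of its uncertainty.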

Relevance: 10.00%

Abstract:

BACKGROUND: The majority of Haemosporida species infect birds or reptiles, but many important genera, including Plasmodium, infect mammals. Dipteran vectors shared by avian, reptilian and mammalian Haemosporida suggest multiple invasions of Mammalia during haemosporidian evolution; yet, phylogenetic analyses have detected only a single invasion event. Until now, several important mammal-infecting genera have been absent from these analyses. This study focuses on the evolutionary origin of Polychromophilus, a unique malaria genus that only infects bats (Microchiroptera) and is transmitted by bat flies (Nycteribiidae). METHODS: Two species of Polychromophilus were obtained from wild bats caught in Switzerland. These were molecularly characterized using four genes (asl, clpc, coI, cytb) from the three different genomes (nucleus, apicoplast, mitochondrion). These data were then combined with data from 60 taxa of Haemosporida available in GenBank. Bayesian inference, maximum likelihood and a range of rooting methods were used to test specific hypotheses concerning the phylogenetic relationships between Polychromophilus and the other haemosporidian genera. RESULTS: The Polychromophilus melanipherus and Polychromophilus murinus samples show genetically distinct patterns and group according to species. The Bayesian tree topology suggests that the monophyletic clade of Polychromophilus falls within the avian/saurian clade of Plasmodium, and directed hypothesis testing confirms the Plasmodium origin. CONCLUSION: The ancestor of Polychromophilus was most likely a bird- or reptile-infecting Plasmodium before it switched to bats. The invasion of mammals as hosts has, therefore, not been a unique event in the evolutionary history of Haemosporida, despite the suspected costs of adapting to a new host. This switch was, moreover, accompanied by a switch in dipteran host.

Relevance: 10.00%

Abstract:

Plasmid DNA and adenovirus vectors currently used in cardiovascular gene therapy trials are limited by low efficiency and short-lived transgene expression, respectively. Recombinant adeno-associated virus (AAV) has recently emerged as an attractive vector for cardiovascular gene therapy. In the present study, we have compared AAV and adenovirus vectors with respect to gene transfer efficiency and the duration of transgene expression in mouse hearts and arteries in vivo. AAV vectors (titer: 5 × 10^8 transducing units (TU)/ml) and adenovirus vectors (1.2 × 10^10 TU/ml) expressing an enhanced green fluorescent protein (EGFP) gene were injected either intramyocardially (n=32) or intrapericardially (n=3) in CD-1 mice. Hearts were harvested at varying time intervals (3 days to 1 year) after gene delivery. After intramyocardial injection of 5 µl of virus stock solution, cardiomyocyte transduction rates with AAV vectors were 4-fold lower than with adenovirus vectors (1.5% (range: 0.5-2.6%) vs. 6.2% (range: 2.7-13.7%); P<0.05), but similar to titer-matched adenovirus vectors (0.7%; range: 0.2-1.2%). AAV-mediated EGFP expression lasted for at least 1 year. AAV vectors instilled into the pericardial space transduced epicardial myocytes. Arterial gene transfer was studied in mouse carotids (n=26). Both vectors selectively transduced endothelial cells after luminal instillation. Transduction rates with AAV vectors were 8-fold lower than with adenovirus vectors (2.0% (range: 0-3.2%) vs. 16.2% (range: 8.5-20.2%); P<0.05). Prolonged EGFP expression was observed after AAV- but not adenovirus-mediated gene transfer. In conclusion, AAV vectors deliver and express genes for extended periods of time in the myocardium and arterial endothelium in vivo. AAV vectors may be useful for gene therapy approaches to chronic cardiovascular diseases.

Relevance: 10.00%

Abstract:

BACKGROUND: Outcome following foot and ankle surgery can be assessed by disease- and region-specific scores. Many scoring systems exist, making comparison among studies difficult. The present study focused on outcome measures for a common foot and ankle abnormality and compared the results obtained by 2 disease-specific and 2 body region-specific scores. METHODS: We reviewed 41 patients who underwent lateral ankle ligament reconstruction. Four outcome scales were administered simultaneously: the Cumberland Ankle Instability Tool (CAIT) and the Chronic Ankle Instability Scale (CAIS), which are disease-specific, and the American Orthopaedic Foot & Ankle Society (AOFAS) hindfoot scale and the Foot and Ankle Ability Measure (FAAM), which are both body region-specific. The degree of correlation between scores was assessed by Pearson's correlation coefficient. Nonparametric tests were performed: the Kruskal-Wallis test and, for pairwise comparisons of the scores, the Mann-Whitney test. RESULTS: A significant difference (P < .005) was observed between the CAIS and the AOFAS score (P = .0002), between the CAIS and the FAAM 1 (P = .0001), and between the CAIT and the AOFAS score (P = .0003). CONCLUSIONS: This study compared the performance of 4 disease- and body region-specific scoring systems. We demonstrated a correlation between the 4 administered scoring systems as well as notable differences between the results given by each of them. Disease-specific scores appeared more accurate than body region-specific scores. A strong correlation between the AOFAS score and the other scales was observed. The FAAM seemed a good compromise because it offers the possibility of evaluating patients according to their own functional demands. CLINICAL RELEVANCE: The present study contributes to the development of more critical and accurate outcome assessment methods in foot and ankle surgery.
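The nonparametric testing strategy described above can be sketched as follows: an overall Kruskal-Wallis test across the four scales, followed by pairwise Mann-Whitney tests compared against a corrected significance threshold. The scores below are hypothetical values on a common 0-100 scale, not the study data.

```python
# Hedged sketch of the nonparametric comparisons: Kruskal-Wallis across four
# outcome scales, then pairwise Mann-Whitney tests. Scores are hypothetical.
import numpy as np
from itertools import combinations
from scipy.stats import kruskal, mannwhitneyu

rng = np.random.default_rng(1)
scores = {  # hypothetical scores for 41 patients, rescaled to 0-100 for each scale
    "CAIT":  rng.normal(72, 10, 41),
    "CAIS":  rng.normal(70, 12, 41),
    "AOFAS": rng.normal(85, 8, 41),
    "FAAM":  rng.normal(80, 10, 41),
}

h, p = kruskal(*scores.values())
print(f"Kruskal-Wallis: H={h:.2f}, p={p:.4f}")

for a, b in combinations(scores, 2):
    u, p = mannwhitneyu(scores[a], scores[b], alternative="two-sided")
    print(f"{a} vs {b}: U={u:.1f}, p={p:.4f}")  # compare against a corrected alpha, e.g. .005
```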

Relevance: 10.00%

Abstract:

This paper reports on the purpose, design, methodology and target audience of e-learning courses in forensic interpretation offered by the authors since 2010, including practical experience gained throughout the implementation period of this project. This initiative was motivated by the fact that reporting the results of forensic examinations in a logically correct and scientifically rigorous way is a daily challenge for any forensic practitioner. Indeed, the interpretation of raw data and the communication of findings in both written and oral statements are topics where knowledge and applied skills are needed. Although most forensic scientists hold educational records in traditional sciences, only a few have actually followed full courses focused on interpretation issues. Such courses should include foundational principles and methodology - including elements of forensic statistics - for the evaluation of forensic data in a way that is tailored to meet the needs of the criminal justice system. In order to help bridge this gap, the authors' initiative seeks to offer educational opportunities that allow practitioners to acquire knowledge and competence in current approaches to the evaluation and interpretation of forensic findings. These cover, among other aspects, probabilistic reasoning (including Bayesian networks and other methods of forensic statistics, tools and software), case pre-assessment, skills in the oral and written communication of uncertainty, and the development of the independence and self-confidence needed to solve practical inference problems. E-learning was chosen as the general format because it helps to form a trans-institutional online community of practitioners from varying forensic disciplines and levels of field experience, such as reporting officers, (chief) scientists and forensic coordinators, but also lawyers, who can all interact directly from their personal workplaces regardless of distance, travel expenses or time schedules. In the authors' experience, the proposed learning initiative supports participants in developing their expertise and skills in forensic interpretation, and also offers an opportunity for the associated institutions and the forensic community to reinforce the development of a harmonized view of interpretation across forensic disciplines, laboratories and judicial systems.

Relevance: 10.00%

Abstract:

Soil penetration resistance (PR) is a measure of soil compaction closely related to soil structure and plant growth. However, the variability of PR hampers statistical analyses. This study aimed to evaluate the effect of the variability of soil PR on the efficiency of parametric and nonparametric analyses in identifying significant effects of soil compaction, and to classify the coefficient of variation of PR into low, intermediate, high and very high classes. On six dates, the PR of a typical dystrophic Red Ultisol under continuous no-tillage for 16 years was measured. Three tillage and/or traffic conditions were established with the application of: (i) no chiseling or additional traffic, (ii) additional compaction, and (iii) chiseling. On each date, the nineteen PR readings (measured every 1.5 cm to a depth of 28.5 cm) were grouped into layers of different thicknesses. In each layer, the treatment effects were evaluated by analysis of variance (ANOVA) and the Kruskal-Wallis test in a completely randomized design, and the coefficients of variation of all analyses were classified (low, intermediate, high and very high). ANOVA performed better in discriminating the compaction effects, but the rejection rate of the null hypothesis decreased from 100 to 80% when the coefficient of variation increased from 15 to 26%. The values of 15 and 26% were the thresholds separating the low/intermediate and the high/very high coefficient of variation classes of PR in this Ultisol.
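The comparison described above can be illustrated with a minimal sketch: for one soil layer, compute the experimental coefficient of variation of the penetration resistance and test the treatment effect with both one-way ANOVA and the Kruskal-Wallis test. The data below are hypothetical PR values, not the study measurements.

```python
# Hedged sketch: coefficient of variation of PR in one layer, plus parametric
# (ANOVA) and nonparametric (Kruskal-Wallis) tests of the treatment effect.
import numpy as np
from scipy.stats import f_oneway, kruskal

rng = np.random.default_rng(2)
# Hypothetical PR values (MPa) for the three treatments in one layer
control   = rng.normal(2.0, 0.4, 8)
compacted = rng.normal(2.8, 0.4, 8)
chiseled  = rng.normal(1.4, 0.4, 8)

groups = (control, compacted, chiseled)
grand_mean = np.concatenate(groups).mean()
within_sd = np.sqrt(np.mean([g.var(ddof=1) for g in groups]))   # residual standard deviation
cv = 100 * within_sd / grand_mean
print(f"experimental coefficient of variation: {cv:.1f}%")      # e.g. 'low' below ~15%

f, p_anova = f_oneway(*groups)
h, p_kw = kruskal(*groups)
print(f"ANOVA p={p_anova:.4f}  Kruskal-Wallis p={p_kw:.4f}")
```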

Relevance: 10.00%

Abstract:

With the advancement of high-throughput sequencing and the dramatic increase in available genetic data, statistical modeling has become an essential part of the field of molecular evolution. Statistical modeling has led to many interesting discoveries in the field, from the detection of highly conserved or diverse regions in a genome to the phylogenetic inference of species' evolutionary history. Among the different types of genome sequences, protein-coding regions are particularly interesting due to their impact on proteins. The building blocks of proteins, i.e. amino acids, are coded by triplets of nucleotides, known as codons. Accordingly, studying the evolution of codons leads to a fundamental understanding of how proteins function and evolve. Current codon models can be classified into three principal groups: mechanistic codon models, empirical codon models and hybrid ones. The mechanistic models attract particular attention due to the clarity of their underlying biological assumptions and parameters. However, they suffer from simplifying assumptions that are required to overcome the burden of computational complexity. The main assumptions applied to current mechanistic codon models are that (a) double and triple substitutions of nucleotides within codons are negligible, (b) there is no mutation variation among the nucleotides of a single codon, and (c) the HKY nucleotide model is sufficient to capture the essence of transition-transversion rates at the nucleotide level. In this thesis, I pursue two main objectives. The first is to develop a framework of mechanistic codon models, named the KCM-based model family framework, based on holding or relaxing the above assumptions. Accordingly, eight different models are proposed from the eight combinations of holding or relaxing the assumptions, from the simplest one that holds all of them to the most general one that relaxes all of them. The models derived from the proposed framework allow me to investigate the biological plausibility of the three simplifying assumptions on real data sets, as well as to find the model best aligned with the underlying characteristics of each data set. Our experiments show that in none of the real data sets is it realistic to hold all three assumptions; using simple models that hold these assumptions can therefore be misleading and result in inaccurate parameter estimates. The second objective is to develop a generalized mechanistic codon model that relaxes the three simplifying assumptions while remaining computationally efficient, using a matrix operation called the Kronecker product. Our experiments show that, on randomly chosen data sets, the proposed generalized mechanistic codon model outperforms the other codon models with respect to the AICc metric in about half of the data sets. In addition, I show through several experiments that the proposed general model is biologically plausible.
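The Kronecker-product construction mentioned above can be illustrated with a minimal sketch: under the common single-substitution assumption (a), a 64×64 codon rate matrix can be assembled from a 4×4 nucleotide rate matrix (here HKY-like) using Kronecker products with identity matrices, one term per codon position. Relaxing the assumptions would correspond to position-specific nucleotide matrices or additional cross terms for double and triple substitutions. This is an illustrative reading of the algebraic idea, not the thesis implementation.

```python
# Illustrative sketch: assembling a 64x64 codon rate matrix from a 4x4 nucleotide
# model via Kronecker products (single-substitution case). Stop codons, selection
# and codon frequencies are omitted for brevity.
import numpy as np

def hky_rate_matrix(pi, kappa):
    """HKY85-style 4x4 rate matrix (nucleotide order A, C, G, T)."""
    A, C, G, T = 0, 1, 2, 3
    transitions = {(A, G), (G, A), (C, T), (T, C)}
    Q = np.zeros((4, 4))
    for i in range(4):
        for j in range(4):
            if i != j:
                Q[i, j] = (kappa if (i, j) in transitions else 1.0) * pi[j]
    np.fill_diagonal(Q, -Q.sum(axis=1))   # each row sums to zero
    return Q

pi = np.array([0.3, 0.2, 0.2, 0.3])
Q_nt = hky_rate_matrix(pi, kappa=2.0)
I = np.eye(4)

# Kronecker sum: one term per codon position, so only single-nucleotide
# substitutions receive a nonzero rate (assumption (a) above).
Q_codon = (np.kron(Q_nt, np.kron(I, I)) +
           np.kron(I, np.kron(Q_nt, I)) +
           np.kron(I, np.kron(I, Q_nt)))

print(Q_codon.shape)                                  # (64, 64)
off_diag = Q_codon - np.diag(np.diag(Q_codon))
print(np.count_nonzero(off_diag))                     # 576 = 64 codons x 9 single-nt neighbours
# Relaxing assumption (b) would use a different Q_nt per codon position; relaxing (a)
# would add cross terms such as np.kron(Q_nt, np.kron(Q_nt, I)) for double substitutions.
```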

Relevance: 10.00%

Abstract:

A systematic assessment of global neural network connectivity through direct electrophysiological assays has remained technically infeasible, even in simpler systems like dissociated neuronal cultures. We introduce an improved algorithmic approach based on Transfer Entropy to reconstruct structural connectivity from network activity monitored through calcium imaging. We focus in this study on the inference of excitatory synaptic links. Based on information theory, our method requires no prior assumptions on the statistics of neuronal firing and neuronal connections. The performance of our algorithm is benchmarked on surrogate time series of calcium fluorescence generated by the simulated dynamics of a network with known ground-truth topology. We find that the functional network topology revealed by Transfer Entropy depends qualitatively on the time-dependent dynamic state of the network (bursting or non-bursting). Thus by conditioning with respect to the global mean activity, we improve the performance of our method. This allows us to focus the analysis to specific dynamical regimes of the network in which the inferred functional connectivity is shaped by monosynaptic excitatory connections, rather than by collective synchrony. Our method can discriminate between actual causal influences between neurons and spurious non-causal correlations due to light scattering artifacts, which inherently affect the quality of fluorescence imaging. Compared to other reconstruction strategies such as cross-correlation or Granger Causality methods, our method based on improved Transfer Entropy is remarkably more accurate. In particular, it provides a good estimation of the excitatory network clustering coefficient, allowing for discrimination between weakly and strongly clustered topologies. Finally, we demonstrate the applicability of our method to analyses of real recordings of in vitro disinhibited cortical cultures where we suggest that excitatory connections are characterized by an elevated level of clustering compared to a random graph (although not extreme) and can be markedly non-local.
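The core quantity above can be sketched in a few lines: transfer entropy from one discretized fluorescence trace to another, estimated from empirical joint frequencies, optionally restricted to time points selected by a mask (for example, frames of low global mean activity, echoing the conditioning idea in the abstract). This is a simplified single-lag, coarse-binning illustration, not the authors' algorithm.

```python
# Minimal sketch of pairwise transfer entropy between two discretized signals:
# TE(X->Y) = sum_{y',y,x} p(y', y, x) * log2[ p(y'|y, x) / p(y'|y) ],
# with a single one-step lag and coarse amplitude binning. Illustration only.
import numpy as np
from collections import Counter

def transfer_entropy(x, y, n_bins=3, mask=None):
    """TE from x to y in bits; `mask` (boolean, same length as x) optionally
    restricts the estimate to selected time points, e.g. low-activity frames."""
    xd = np.digitize(x, np.linspace(x.min(), x.max(), n_bins + 1)[1:-1])
    yd = np.digitize(y, np.linspace(y.min(), y.max(), n_bins + 1)[1:-1])
    t = np.arange(len(x) - 1)
    if mask is not None:
        t = t[mask[:-1]]
    triples = Counter(zip(yd[t + 1], yd[t], xd[t]))   # counts of (y_{t+1}, y_t, x_t)
    n = sum(triples.values())
    pairs_yy, pairs_yx, singles_y = Counter(), Counter(), Counter()
    for (y1, y0, x0), c in triples.items():
        pairs_yy[(y1, y0)] += c
        pairs_yx[(y0, x0)] += c
        singles_y[y0] += c
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_full = c / pairs_yx[(y0, x0)]               # p(y_{t+1} | y_t, x_t)
        p_past = pairs_yy[(y1, y0)] / singles_y[y0]   # p(y_{t+1} | y_t)
        te += p_joint * np.log2(p_full / p_past)
    return te

# Toy example: y follows x with a one-frame delay, so TE(x->y) >> TE(y->x)
rng = np.random.default_rng(3)
x = rng.normal(size=2000)
y = np.roll(x, 1) + 0.5 * rng.normal(size=2000)
print(transfer_entropy(x, y), transfer_entropy(y, x))
```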

Relevance: 10.00%

Abstract:

The aim of the present study was to establish and compare the durations of the seminiferous epithelium cycles of the common shrew Sorex araneus, which is characterized by a high metabolic rate and multiple paternity, and the greater white-toothed shrew Crocidura russula, which is characterized by a low metabolic rate and a monogamous mating system. Twelve S. araneus males and fifteen C. russula males were injected intraperitoneally with 5-bromodeoxyuridine, and the testes were collected. For cycle length determination, we applied both the classical method of estimation and, as a new method, linear regression. With regard to variance, and even with a relatively small sample size, the new method appears to be more precise. In addition, the regression method allows information to be inferred for every animal tested, enabling cycle length to be compared against other factors measured in the same animals. Our results show that increased testis size not only leads to increased sperm production but also reduces the duration of spermatogenesis. The calculated cycle lengths were 8.35 days for S. araneus and 12.12 days for C. russula. The data obtained in the present study provide the basis for future investigations into the effects of metabolic rate and mating systems on the speed of spermatogenesis.
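One plausible reading of the regression approach is the following: if the most advanced BrdU-labelled germ cells are recorded as the fraction of one seminiferous cycle they have traversed at a known time after injection, the cycle duration can be estimated as the reciprocal of the slope of a fraction-versus-time regression. The sketch below illustrates only that logic, with invented numbers; the actual staging and per-animal handling in the study may differ.

```python
# Rough sketch of a regression-based cycle-length estimate: regress the fraction
# of one seminiferous cycle traversed by the BrdU-labelled front against time
# since injection; cycle length = 1 / slope. All numbers are invented.
import numpy as np
from scipy.stats import linregress

time_days = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0])              # time after BrdU injection
cycle_fraction = np.array([0.06, 0.12, 0.24, 0.37, 0.48, 0.60, 0.73])  # hypothetical staging data

fit = linregress(time_days, cycle_fraction)
cycle_length_days = 1.0 / fit.slope
print(f"estimated cycle length: {cycle_length_days:.2f} days (r^2 = {fit.rvalue**2:.3f})")
```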

Relevance: 10.00%

Abstract:

This paper extends previous research [1] on the use of multivariate continuous data in comparative handwriting examinations, notably for gender classification. A database has been constructed by analyzing the contour shape of loop characters of type a and d by means of Fourier analysis, which allows characters to be described in a global way by a set of variables (i.e., Fourier descriptors). Sample handwritings were collected from right- and left-handed female and male writers. The results reported in this paper provide further arguments in support of the view that investigative settings in forensic science represent an area of application for which the Bayesian approach offers a logical framework. In particular, the Bayes factor is computed for settings that focus on inference of the gender and handedness of the author of an incriminated handwritten text. Emphasis is placed on comparing the efficiency of the characters a and d for investigative purposes.
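The two ingredients mentioned above can be sketched roughly as follows: Fourier descriptors computed from a closed character contour (magnitudes of the low-order FFT coefficients of the complex-coded outline), and a simple two-class Bayes factor from Gaussian models fitted to reference data (for example, female versus male writers). All data below are synthetic and the model is a simplified stand-in, not the published one.

```python
# Sketch: Fourier descriptors of a closed contour + a simple two-class Bayes factor.
# Synthetic data and a simplified Gaussian model; not the published approach.
import numpy as np
from scipy.stats import multivariate_normal

def fourier_descriptors(contour_xy, n_desc=5):
    """Magnitudes of the low-order Fourier coefficients of a closed contour,
    normalized for position (DC term dropped) and scale (first harmonic = 1)."""
    z = contour_xy[:, 0] + 1j * contour_xy[:, 1]   # complex coding of the outline
    coeffs = np.fft.fft(z)
    mags = np.abs(coeffs[1:n_desc + 1])
    return mags / mags[0]

def bayes_factor(x, ref_1, ref_2):
    """Likelihood ratio of descriptor vector x under Gaussian models fitted to
    the reference descriptors of two classes (e.g., female vs. male writers)."""
    f1 = multivariate_normal(ref_1.mean(axis=0), np.cov(ref_1.T)).pdf(x)
    f2 = multivariate_normal(ref_2.mean(axis=0), np.cov(ref_2.T)).pdf(x)
    return f1 / f2

# Toy loop-shaped outline standing in for a scanned character contour
theta = np.linspace(0, 2 * np.pi, 128, endpoint=False)
contour = np.c_[np.cos(theta), 1.4 * np.sin(theta)]
x = fourier_descriptors(contour)

# Hypothetical reference populations of descriptor vectors for the two classes
rng = np.random.default_rng(4)
ref_f = x + rng.normal(0, 0.02, size=(60, 5))
ref_m = x + np.array([0.0, 0.05, 0.02, 0.01, 0.01]) + rng.normal(0, 0.02, size=(60, 5))

print(f"Bayes factor (class 1 vs class 2): {bayes_factor(x, ref_f, ref_m):.2f}")
```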