910 results for Bayesian maximum entropy


Relevance: 20.00%

Abstract:

To evaluate the effect of pregnancy and smoking on endothelial function using brachial artery flow-mediated dilation (FMD) and to determine the time to maximum brachial artery dilation after stimulus. This observational study evaluated 133 women, grouped as follows: non-smoking pregnant women (N = 47), smoking pregnant women (N = 33), non-smoking non-pregnant women (N = 34), and smoking non-pregnant women (N = 19). The diameter of the brachial artery was measured at baseline and at 30, 60, 90 and 120 s after stimulus, and the relative change in diameter was determined at each of these four time points. FMD measured at 60 s after stimulus was compared between the groups. The maximum FMD was observed at 60 s after cuff release in all groups. FMD was greater in non-smoking pregnant women than in smoking pregnant women (11.50 ± 5.77 vs. 8.74 ± 4.83; p = 0.03) and in non-smoking non-pregnant women than in smoking non-pregnant women (10.52 ± 4.76 vs. 7.21 ± 5.57; p = 0.03). Maximum FMD was observed approximately 60 s after stimulus in all groups regardless of smoking and pregnancy status. Smoking appears to lead to endothelial dysfunction in both pregnant and non-pregnant women, as demonstrated by the lower FMD in smokers.
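
A minimal sketch of the relative-change (FMD%) computation described in this abstract: percentage change in arterial diameter from baseline at each post-stimulus time point. The diameter values are illustrative, not data from the study.

```python
# Relative change (FMD%) of brachial artery diameter at each time point.
# Diameters below are invented for illustration only.

baseline_mm = 3.40                                             # diameter at baseline
post_stimulus_mm = {30: 3.55, 60: 3.78, 90: 3.70, 120: 3.62}   # diameter after cuff release

fmd_percent = {
    t: 100.0 * (d - baseline_mm) / baseline_mm
    for t, d in post_stimulus_mm.items()
}

for t, fmd in fmd_percent.items():
    print(f"{t:>3} s: FMD = {fmd:.2f}%")
# In this made-up example the maximum dilation falls at 60 s, matching the
# pattern reported in the abstract.
```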

Relevance: 20.00%

Abstract:

The objective of this study was to analyze electromyographic (EMG) data before and after normalization. One hundred normal subjects (with no signs or symptoms of temporomandibular disorders) participated in this study. Surface EMG of the masticatory muscles was performed using two different tests: maximum voluntary clench (MVC) on cotton rolls and MVC in the intercuspal position. Normalization was performed using the mean value of the EMG signal from the first examination. The coefficient of variation (CV) was lower for the normalized data. Normalization was effective in reducing the differences between records from the same subject and between different subjects.
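
A hedged sketch of the normalization step described above: each subject's record is divided by the mean amplitude of that subject's first (reference) MVC examination, so subject-specific gains (electrode placement, skin impedance) cancel and the between-subject CV drops. All numbers are synthetic stand-ins, not study data.

```python
import numpy as np

rng = np.random.default_rng(0)
n_subjects = 100
true_activity = 1.0                               # common underlying activation level
gains = rng.uniform(0.5, 2.0, size=n_subjects)    # subject-specific amplitude scaling

# Mean EMG amplitude of the reference (first) exam and of a later test exam.
reference_means = gains * true_activity * rng.normal(1.0, 0.05, size=n_subjects)
test_means = gains * true_activity * rng.normal(1.0, 0.05, size=n_subjects)

normalized = test_means / reference_means         # dimensionless; gain cancels

def cv(x):
    """Coefficient of variation across subjects."""
    return x.std() / x.mean()

print(f"CV of raw records:        {cv(test_means):.3f}")   # dominated by gain spread
print(f"CV of normalized records: {cv(normalized):.3f}")   # much smaller
```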

Relevance: 20.00%

Abstract:

The research diagnostic criteria for temporomandibular disorders (RDC/TMD) are used for the classification of patients with temporomandibular disorders (TMD). Surface electromyography of the right and left masseter and temporalis muscles was performed during maximum teeth clenching in 103 TMD patients subdivided according to the RDC/TMD into three non-overlapping groups: (a) 25 myogenous, (b) 61 arthrogenous, and (c) 17 psychogenous patients. Thirty-two control subjects matched for sex and age were also measured. During clenching, standardized total muscle activities (electromyographic potentials over time) differed significantly: 131.7 μV/μV·s % in the normal subjects, 117.6 μV/μV·s % in the myogenous patients, 105.3 μV/μV·s % in the arthrogenous patients, and 88.7 μV/μV·s % in the psychogenous patients (p < 0.001, analysis of covariance). Symmetry in the temporalis muscles was larger in normal subjects (86.3%) and myogenous patients (84.9%) than in arthrogenous (82.7%) and psychogenous patients (80.5%) (p = 0.041). No differences were found for masseter muscle symmetry or torque coefficient (p > 0.05). Surface electromyography of the masticatory muscles allowed an objective discrimination among different RDC/TMD subgroups. This evaluation could assist conventional clinical assessments.
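
The abstract reports symmetry percentages for paired muscles but does not define the index. The sketch below uses one common overlap-style coefficient (100% when the left and right EMG envelopes are identical) purely as an assumed illustration; it is not necessarily the paper's formula.

```python
import numpy as np

def symmetry_percent(left, right):
    """Assumed overlap-style symmetry index: 100% for identical envelopes."""
    left, right = np.asarray(left, float), np.asarray(right, float)
    return 100.0 * 2.0 * np.minimum(left, right).sum() / (left + right).sum()

# Illustrative EMG envelopes (arbitrary units), not study data:
left_temporalis = np.array([10.0, 12.0, 15.0, 11.0])
right_temporalis = np.array([9.0, 13.0, 14.0, 10.0])
print(f"Symmetry: {symmetry_percent(left_temporalis, right_temporalis):.1f}%")
```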

Relevance: 20.00%

Abstract:

We describe in detail the theory underpinning the measurement of density matrices of a pair of quantum two-level systems (qubits). Our particular emphasis is on qubits realized by the two polarization degrees of freedom of a pair of entangled photons generated in a down-conversion experiment; however, the discussion applies in general, regardless of the actual physical realization. Two techniques are discussed, namely, a tomographic reconstruction (in which the density matrix is linearly related to a set of measured quantities) and a maximum likelihood technique which requires numerical optimization (but has the advantage of producing density matrices that are always non-negative definite). In addition, a detailed error analysis is presented, allowing errors in quantities derived from the density matrix, such as the entropy or entanglement of formation, to be estimated. Examples based on down-conversion experiments are used to illustrate our results.
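
A hedged sketch of the two reconstruction routes discussed above, reduced to a single qubit for brevity (the paper treats two-qubit polarization states; the structure is the same). Linear tomography inverts measured Pauli expectations directly and can yield a non-positive matrix at finite counts; maximum likelihood optimizes over a parameterization rho = T†T / tr(T†T) that is non-negative definite by construction. The counts below are simulated, and the true state and measurement settings are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

I2 = np.eye(2)
PAULIS = [np.array([[0, 1], [1, 0]], complex),
          np.array([[0, -1j], [1j, 0]], complex),
          np.array([[1, 0], [0, -1]], complex)]

def rho_from_t(t):
    """Lower-triangular (Cholesky-style) parameterization: always a valid state."""
    T = np.array([[t[0], 0], [t[2] + 1j * t[3], t[1]]], complex)
    rho = T.conj().T @ T
    return rho / np.trace(rho).real

# Simulate N measurements of each Pauli operator on a pure true state.
rng = np.random.default_rng(1)
rho_true = np.array([[0.9, 0.3], [0.3, 0.1]], complex)
N = 200
p_up = [np.real(np.trace(rho_true @ (I2 + P) / 2)) for P in PAULIS]
counts_up = [rng.binomial(N, p) for p in p_up]
r_est = [2 * c / N - 1 for c in counts_up]          # estimated <X>, <Y>, <Z>

# Linear (tomographic) reconstruction: may have a negative eigenvalue.
rho_lin = 0.5 * (I2 + sum(r * P for r, P in zip(r_est, PAULIS)))
print("linear eigenvalues:", np.linalg.eigvalsh(rho_lin).round(4))

# Maximum likelihood: binomial log-likelihood of the observed Pauli counts.
def neg_log_lik(t):
    rho = rho_from_t(t)
    ll = 0.0
    for c, P in zip(counts_up, PAULIS):
        p = np.clip(np.real(np.trace(rho @ (I2 + P) / 2)), 1e-9, 1 - 1e-9)
        ll += c * np.log(p) + (N - c) * np.log(1 - p)
    return -ll

res = minimize(neg_log_lik, x0=[1.0, 1.0, 0.0, 0.0], method="Nelder-Mead")
rho_ml = rho_from_t(res.x)
print("ML eigenvalues:    ", np.linalg.eigvalsh(rho_ml).round(4))  # all >= 0
```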

Relevance: 20.00%

Abstract:

In this paper we study an astonishing similarity between the utility representation problem in economics and the entropy representation problem in thermodynamics.

Relevance: 20.00%

Abstract:

Binning and truncation of data are common in data analysis and machine learning. This paper addresses the problem of fitting mixture densities to multivariate binned and truncated data. The EM approach proposed by McLachlan and Jones (Biometrics 44(2): 571-578, 1988) for the univariate case is generalized to multivariate measurements. The multivariate solution requires the evaluation of multidimensional integrals over each bin at each iteration of the EM procedure. A naive implementation of the procedure can be computationally inefficient. To reduce the computational cost, a number of straightforward numerical techniques are proposed. Results on simulated data indicate that the proposed methods can achieve significant computational gains with no loss in the accuracy of the final parameter estimates. Furthermore, experimental results suggest that with a sufficient number of bins and data points it is possible to estimate the true underlying density almost as well as if the data were not binned. The paper concludes with a brief description of an application of this approach to the diagnosis of iron-deficiency anemia, in the context of binned and truncated bivariate measurements of volume and hemoglobin concentration from an individual's red blood cells.
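
A simplified univariate sketch of EM for a two-component Gaussian mixture on binned data, in the spirit of the McLachlan-Jones approach that the paper generalizes. Per-bin component probabilities use exact bin integrals (CDF differences); within-bin moments are approximated by bin midpoints to keep the sketch short, whereas the paper evaluates the integrals properly. Data are simulated.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(-2, 1, 3000), rng.normal(3, 1.5, 2000)])
edges = np.linspace(-8, 10, 37)
counts, _ = np.histogram(x, edges)
mids = 0.5 * (edges[:-1] + edges[1:])

# Initial guesses (assumed, for illustration).
w, mu, sd = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([2.0, 2.0])

for _ in range(200):
    # E-step: P(bin | component) via CDF differences, then per-bin posteriors.
    bin_p = np.stack([norm.cdf(edges[1:], m, s) - norm.cdf(edges[:-1], m, s)
                      for m, s in zip(mu, sd)])           # shape (2, n_bins)
    joint = w[:, None] * bin_p
    resp = joint / joint.sum(axis=0, keepdims=True)        # responsibilities
    # M-step: moment updates, bin midpoints as within-bin approximation.
    nk = (resp * counts).sum(axis=1)
    w = nk / counts.sum()
    mu = (resp * counts * mids).sum(axis=1) / nk
    sd = np.sqrt((resp * counts * (mids - mu[:, None]) ** 2).sum(axis=1) / nk)

print("weights:", w.round(3), "means:", mu.round(3), "sds:", sd.round(3))
```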

Relevance: 20.00%

Abstract:

We compare two different approaches to the control of the dynamics of a continuously monitored open quantum system. The first is Markovian feedback, as introduced in quantum optics by Wiseman and Milburn [Phys. Rev. Lett. 70, 548 (1993)]. The second is feedback based on an estimate of the system state, developed recently by Doherty and Jacobs [Phys. Rev. A 60, 2700 (1999)]. Here we choose to call it, for brevity, Bayesian feedback. For systems with nonlinear dynamics, we expect these two methods of feedback control to give markedly different results. The simplest possible nonlinear system is a driven and damped two-level atom, so we choose this as our model system. The monitoring is taken to be homodyne detection of the atomic fluorescence, and the control is by modulating the driving. The aim of the feedback in both cases is to stabilize the internal state of the atom as close as possible to an arbitrarily chosen pure state, in the presence of inefficient detection and other forms of decoherence. Our results (obtained without recourse to stochastic simulations) prove that Bayesian feedback is never inferior, and is usually superior, to Markovian feedback. However, it would be far more difficult to implement than Markovian feedback and it loses its superiority when obvious simplifying approximations are made. It is thus not clear which form of feedback would be better in the face of inevitable experimental imperfections.
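
A hedged sketch of the model system named above: a driven, damped two-level atom. The feedback comparison in the paper uses stochastic (conditioned) master equations; the code below only integrates the unconditional Lindblad master equation with simple Euler steps, to show the open-system dynamics that the Markovian and Bayesian schemes act on. Parameter values are illustrative.

```python
import numpy as np

# Basis ordering (|g>, |e>): sigma_- lowers the excited state to the ground state.
sm = np.array([[0, 1], [0, 0]], complex)       # lowering operator sigma_-
sx = np.array([[0, 1], [1, 0]], complex)
gamma, omega = 1.0, 2.0                         # decay rate, Rabi drive strength
H = 0.5 * omega * sx

def lindblad_rhs(rho):
    """d(rho)/dt for a driven atom with spontaneous emission."""
    comm = -1j * (H @ rho - rho @ H)
    D = sm @ rho @ sm.conj().T - 0.5 * (sm.conj().T @ sm @ rho + rho @ sm.conj().T @ sm)
    return comm + gamma * D

rho = np.array([[0, 0], [0, 1]], complex)       # atom initially excited
dt, steps = 0.001, 10000                        # Euler integration to t = 10/gamma
for _ in range(steps):
    rho = rho + dt * lindblad_rhs(rho)

print("excited-state population:", np.real(rho[1, 1]).round(4))
print("steady-state purity:     ", np.real(np.trace(rho @ rho)).round(4))
# Purity < 1: without feedback the steady state is mixed, which is exactly
# what the feedback schemes above try to improve upon.
```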

Relevance: 20.00%

Abstract:

Objectives: To compare the population modelling programs NONMEM and P-PHARM during investigation of the pharmacokinetics of tacrolimus in paediatric liver-transplant recipients. Methods: Population pharmacokinetic analysis was performed using NONMEM and P-PHARM on retrospective data from 35 paediatric liver-transplant patients receiving tacrolimus therapy. The same data were presented to both programs. Maximum likelihood estimates were sought for apparent clearance (CL/F) and apparent volume of distribution (V/F). Covariates screened for influence on these parameters were weight, age, gender, post-operative day, days of tacrolimus therapy, transplant type, biliary reconstructive procedure, liver function tests, creatinine clearance, haematocrit, corticosteroid dose, and potential interacting drugs. Results: A satisfactory model was developed in both programs with a single categorical covariate, transplant type, providing stable parameter estimates and small, normally distributed (weighted) residuals. In NONMEM, the continuous covariates age and liver function tests improved modelling further. Mean parameter estimates were CL/F (whole liver) = 16.3 L/h, CL/F (cut-down liver) = 8.5 L/h and V/F = 565 L in NONMEM, and CL/F = 8.3 L/h and V/F = 155 L in P-PHARM. Individual Bayesian parameter estimates were CL/F (whole liver) = 17.9 ± 8.8 L/h, CL/F (cut-down liver) = 11.6 ± 18.8 L/h and V/F = 712 ± 792 L in NONMEM, and CL/F (whole liver) = 12.8 ± 3.5 L/h, CL/F (cut-down liver) = 8.2 ± 3.4 L/h and V/F = 221 ± 164 L in P-PHARM. Marked interindividual kinetic variability (38-108%) and residual random error (approximately 3 ng/ml) were observed. P-PHARM was more user-friendly and readily provided informative graphical presentation of results. NONMEM allowed a wider choice of error models for statistical modelling and coped better with complex covariate data sets. Conclusion: Results from parametric modelling programs can vary because of the different algorithms employed to estimate parameters, alternative methods of covariate analysis, and variations and limitations in the software itself.
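
A hedged sketch of the kind of structural model that sits behind CL/F and V/F estimates like those above: a one-compartment model with first-order absorption for an oral drug. This is a generic illustration, not the covariate model fitted in NONMEM or P-PHARM; the absorption rate constant ka and the dose are invented, and the actual tacrolimus model may differ.

```python
import numpy as np

def concentration(t, dose_mg, cl_f, v_f, ka):
    """Plasma concentration (mg/L) after a single oral dose at t = 0 (hours).

    One-compartment, first-order absorption; requires ka != CL/F / V/F.
    """
    ke = cl_f / v_f                               # elimination rate constant, 1/h
    return (dose_mg * ka / (v_f * (ka - ke))) * (np.exp(-ke * t) - np.exp(-ka * t))

# CL/F and V/F taken from the NONMEM whole-liver estimates above;
# dose and ka are illustrative assumptions.
t = np.linspace(0, 24, 7)
c = concentration(t, dose_mg=5.0, cl_f=16.3, v_f=565.0, ka=1.0)
for ti, ci in zip(t, c):
    print(f"t = {ti:5.1f} h: C = {ci * 1000:6.2f} ng/mL")
```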

Relevance: 20.00%

Abstract:

We compare Bayesian methodology utilizing the freeware BUGS (Bayesian Inference Using Gibbs Sampling) with the traditional structural equation modelling approach based on another freeware package, Mx. Dichotomous and ordinal (three-category) twin data were simulated according to different additive genetic and common environment models for phenotypic variation. Practical issues are discussed in using Gibbs sampling, as implemented by BUGS, to fit subject-specific Bayesian generalized linear models, in which the components of variation may be estimated directly. The simulation study (based on 2000 twin pairs) indicated that there is a consistent advantage in using the Bayesian method to detect a correct model under certain specifications of additive genetic and common environmental effects. For binary data, both methods had difficulty in detecting the correct model when the additive genetic effect was low (between 10 and 20%) or of moderate range (between 20 and 40%). Furthermore, neither method could adequately detect a correct model that included a modest common environmental effect (20%), even when the additive genetic effect was large (50%). Power was significantly improved with ordinal data for most scenarios, except for the case of low heritability under a true ACE model. We illustrate and compare both methods using data from 1239 twin pairs over the age of 50 years who were registered with the Australian National Health and Medical Research Council Twin Registry (ATR) and presented symptoms associated with osteoarthritis occurring in joints of the hand.
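
A brief sketch of the classical ACE decomposition that underlies the twin models above: phenotypic variance is split into additive genetic (a²), common environment (c²) and unique environment (e²) components, giving expected twin correlations r_MZ = a² + c² and r_DZ = 0.5·a² + c². Falconer's simple estimates invert these two equations; the analyses above instead fit the model by structural equations (Mx) or Gibbs sampling (BUGS). The correlations below are illustrative, not the paper's data.

```python
def falconer(r_mz, r_dz):
    """Falconer estimates from monozygotic and dizygotic twin correlations."""
    a2 = 2.0 * (r_mz - r_dz)      # additive genetic variance (heritability)
    c2 = r_mz - a2                # common-environment variance
    e2 = 1.0 - r_mz               # unique environment (plus measurement error)
    return a2, c2, e2

a2, c2, e2 = falconer(r_mz=0.60, r_dz=0.40)
print(f"a^2 = {a2:.2f}, c^2 = {c2:.2f}, e^2 = {e2:.2f}")
```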

Relevance: 20.00%

Abstract:

Complete small subunit ribosomal RNA gene (ssrDNA) and partial (D1-D3) large subunit ribosomal RNA gene (lsrDNA) sequences were used to estimate the phylogeny of the Digenea via maximum parsimony and Bayesian inference. Here we contribute 80 new ssrDNA and 124 new lsrDNA sequences. Fully complementary data sets of the two genes were assembled from newly generated and previously published sequences and comprised 163 digenean taxa representing 77 nominal families and seven aspidogastrean outgroup taxa representing three families. Analyses were conducted on the genes independently as well as combined; separate analyses including only the higher plagiorchiidan taxa were also performed, using a reduced-taxon alignment that incorporated additional characters which could not otherwise be unambiguously aligned. The combined data analyses yielded the most strongly supported results, and differences between the two methods of analysis lay primarily in their degree of resolution. The Bayesian analysis including all taxa and characters, and incorporating a model of nucleotide substitution (general-time-reversible with among-site rate heterogeneity), was considered the best estimate of the phylogeny and was used to evaluate the classification and evolution of the group. In broad terms, the Digenea forms a basal dichotomy between a lineage leading to the Brachylaimoidea, Diplostomoidea and Schistosomatoidea (collectively the Diplostomida nomen novum (nom. nov.)) and the remainder of the Digenea (the Plagiorchiida), in which the Bivesiculata nom. nov. and Transversotremata nom. nov. form the two most basal lineages, followed by the Hemiurata. The remainder of the Plagiorchiida forms a large number of independent lineages leading to the crown clade Xiphidiata nom. nov., which comprises the Allocreadioidea, Gorgoderoidea, Microphalloidea and Plagiorchioidea, united by the presence of a penetrating stylet in their cercariae. Although a majority of families and, to a lesser degree, superfamilies are supported as currently defined, the traditional divisions of the Echinostomida, Plagiorchiida and Strigeida were found to comprise non-natural assemblages. Therefore, the membership of established higher taxa is emended, new taxa are erected, and a revised, phylogenetically based classification is proposed and discussed in light of ontogeny, morphology and taxonomic history.
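
A hedged sketch of the substitution model named above: a general-time-reversible (GTR) rate matrix for nucleotides, from which transition probabilities over a branch of length t follow as P(t) = expm(Q·t). The analyses above add among-site rate heterogeneity (typically a gamma distribution of rates), which is omitted here, and the base frequencies and exchangeability parameters below are illustrative, not estimates from the paper.

```python
import numpy as np
from scipy.linalg import expm

pi = np.array([0.3, 0.2, 0.2, 0.3])              # base frequencies A, C, G, T
# Exchangeability parameters for each unordered nucleotide pair.
r = {"AC": 1.0, "AG": 4.0, "AT": 1.0, "CG": 1.0, "CT": 4.0, "GT": 1.0}

Q = np.zeros((4, 4))
pairs = [(0, 1, "AC"), (0, 2, "AG"), (0, 3, "AT"),
         (1, 2, "CG"), (1, 3, "CT"), (2, 3, "GT")]
for i, j, key in pairs:
    Q[i, j] = r[key] * pi[j]                      # rate i -> j
    Q[j, i] = r[key] * pi[i]                      # rate j -> i (time-reversible)
np.fill_diagonal(Q, -Q.sum(axis=1))               # rows sum to zero
Q /= -(pi * np.diag(Q)).sum()                     # normalize to 1 substitution/site

P = expm(Q * 0.1)                                 # branch length: 0.1 subs/site
print(P.round(4))                                 # rows/columns ordered A, C, G, T
```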