954 results for Stochastic Differential Equations, Parameter Estimation, Maximum Likelihood, Simulation, Moments
Abstract:
The generalized Gibbs sampler (GGS) is a recently developed Markov chain Monte Carlo (MCMC) technique that enables Gibbs-like sampling of state spaces that lack a convenient representation in terms of a fixed coordinate system. This paper describes a new sampler, called the tree sampler, which uses the GGS to sample from a state space consisting of phylogenetic trees. The tree sampler is useful for a wide range of phylogenetic applications, including Bayesian, maximum likelihood, and maximum parsimony methods. A fast new algorithm to search for a maximum parsimony phylogeny is presented, using the tree sampler in the context of simulated annealing. The mathematics underlying the algorithm is explained and its time complexity is analyzed. The method is tested on two large data sets consisting of 123 sequences and 500 sequences, respectively. The new algorithm is shown to compare very favorably in terms of speed and accuracy to the program DNAPARS from the PHYLIP package.
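To make the search strategy concrete, here is a minimal simulated-annealing skeleton for a maximum parsimony search. It is a sketch under stated assumptions: `parsimony_score` and `random_neighbor` are hypothetical placeholders, since the paper's proposals come from the GGS-based tree sampler, which is not reproduced here.

```python
import math
import random

def anneal(tree, parsimony_score, random_neighbor,
           t0=10.0, cooling=0.999, steps=100_000, seed=0):
    """Simulated-annealing search for a low-parsimony-score tree.

    `parsimony_score` and `random_neighbor` are placeholders for a
    phylogenetics toolkit; the paper instead draws proposals with the
    generalized Gibbs sampler.
    """
    rng = random.Random(seed)
    cur, cur_s = tree, parsimony_score(tree)
    best, best_s = cur, cur_s
    temp = t0
    for _ in range(steps):
        cand = random_neighbor(cur, rng)
        cand_s = parsimony_score(cand)
        # Accept downhill moves always; uphill moves with Boltzmann probability.
        if cand_s <= cur_s or rng.random() < math.exp((cur_s - cand_s) / temp):
            cur, cur_s = cand, cand_s
            if cur_s < best_s:
                best, best_s = cur, cur_s
        temp *= cooling  # geometric cooling schedule
    return best, best_s
```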
Abstract:
A mixture model for long-term survivors has been adopted in various fields such as biostatistics and criminology where some individuals may never experience the type of failure under study. It is directly applicable in situations where the only information available from follow-up on individuals who will never experience this type of failure is in the form of censored observations. In this paper, we consider a modification to the model so that it still applies in the case where during the follow-up period it becomes known that an individual will never experience failure from the cause of interest. Unless a model allows for this additional information, a consistent survival analysis will not be obtained. A partial maximum likelihood (ML) approach is proposed that preserves the simplicity of the long-term survival mixture model and provides consistent estimators of the quantities of interest. Some simulation experiments are performed to assess the efficiency of the partial ML approach relative to the full ML approach for survival in the presence of competing risks.
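For reference, one common parameterization of the long-term survivor (cure) mixture — an assumption here, since the abstract fixes no notation — writes the population survival function as a two-component mixture in which a fraction $\pi$ of immunes never fails:

```latex
% Long-term survivor (cure) mixture: population survival function,
% with \pi the immune proportion and S_u a proper survival function
% for the susceptibles.
\[
  S(t) \;=\; \pi + (1 - \pi)\, S_u(t), \qquad 0 \le \pi \le 1, \; t \ge 0 .
\]
```

The paper's modification concerns follow-up that identifies some individuals as immune, information that is not used when all non-failures are simply treated as censored.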
Abstract:
Hemichordates were traditionally allied to the chordates, but recent molecular analyses have suggested that hemichordates are a sister group to the echinoderms, a relationship that has important consequences for the interpretation of the evolution of deuterostome body plans. However, the molecular phylogenetic analyses to date have not provided robust support for the hemichordate + echinoderm clade. We use a maximum likelihood framework, including the parametric bootstrap, to reanalyze DNA data from complete mitochondrial genomes and nuclear 18S rRNA. This approach provides the first statistically significant support for the hemichordate + echinoderm clade from molecular data. This grouping implies that the ancestral deuterostome had features that included an adult with a pharynx and a dorsal nerve cord and an indirectly developing dipleurula-like larva.
Abstract:
Objective: To measure prevalence and model incidence of HIV infection. Setting: 2,013 consecutive pregnant women attending public sector antenatal clinics in 1997 in Hlabisa health district, South Africa; historical seroprevalence data, 1992-1995. Methods: Serum remaining from syphilis testing was tested anonymously for antibodies to HIV to determine seroprevalence. Two models, allowing for differential mortality between HIV-positive and HIV-negative people, were used. The first used serial seroprevalence data to estimate trends in annual incidence. The second, a maximum likelihood model, took account of the changing force of infection and the age-dependent risk of infection to estimate age-specific HIV incidence in 1997. Multiple logistic regression provided adjusted odds ratios (OR) for risk factors for prevalent HIV infection. Results: Estimated annual HIV incidence increased from 4% in 1992/1993 to 10% in 1996/1997. In 1997, the highest age-specific incidence was 16%, among women aged between 20 and 24 years. In 1997, overall prevalence was 26% (95% confidence interval [CI], 24%-28%) and was highest, at 34%, among women aged between 20 and 24 years. Young age (<30 years; OR, 2.1; p = .001), unmarried status (OR, 2.2; p = .001), and living in less remote parts of the district (OR, 1.5; p = .002) were associated with HIV prevalence in univariate analysis. Associations were less strong in multivariate analysis. Partner's migration status was not associated with HIV infection. Substantial heterogeneity of HIV prevalence by clinic was observed (range, 17%-31%; test for trend, p = .001). Conclusions: This community is experiencing an explosive HIV epidemic. Young, single women in the more developed parts of the district would form an appropriate cohort to test, and to benefit from, interventions such as vaginal microbicides and HIV vaccines.
Abstract:
Normal mixture models are being increasingly used to model the distributions of a wide variety of random phenomena and to cluster sets of continuous multivariate data. However, for a set of data containing a group or groups of observations with longer than normal tails or atypical observations, the use of normal components may unduly affect the fit of the mixture model. In this paper, we consider a more robust approach by modelling the data by a mixture of t distributions. The use of the ECM algorithm to fit this t mixture model is described and examples of its use are given in the context of clustering multivariate data in the presence of atypical observations in the form of background noise.
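As a sketch of the approach, not the paper's implementation: the following fits a univariate $k$-component t mixture by an ECM-style iteration with the degrees of freedom $\nu$ held fixed (estimating $\nu$ adds a further CM step). The precision weights $u$ are what make the fit robust, since atypical points are downweighted.

```python
import numpy as np
from scipy.stats import t as t_dist

def fit_t_mixture(x, k=2, nu=4.0, n_iter=200, seed=0):
    """ECM-style fit of a univariate k-component t mixture, nu fixed.

    A sketch: the paper's ECM algorithm also updates nu, omitted here
    for brevity. `x` is a 1-D NumPy array.
    """
    rng = np.random.default_rng(seed)
    n = len(x)
    pi = np.full(k, 1.0 / k)
    mu = rng.choice(x, size=k, replace=False)
    sigma2 = np.full(k, np.var(x))
    for _ in range(n_iter):
        # E-step: responsibilities tau and robustness weights u.
        dens = np.array([pi[j] * t_dist.pdf(x, nu, loc=mu[j],
                                            scale=np.sqrt(sigma2[j]))
                         for j in range(k)])
        tau = dens / dens.sum(axis=0)
        delta = (x - mu[:, None]) ** 2 / sigma2[:, None]  # squared std. distance
        u = (nu + 1.0) / (nu + delta)  # small for outliers -> downweighted
        # CM-steps: mixing proportions, means, variances.
        w = tau * u
        pi = tau.sum(axis=1) / n
        mu = (w * x).sum(axis=1) / w.sum(axis=1)
        sigma2 = (w * (x - mu[:, None]) ** 2).sum(axis=1) / tau.sum(axis=1)
    return pi, mu, sigma2
```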
Abstract:
The Yang-Mills-Higgs field generalizes the Yang-Mills field. The authors establish the local existence and uniqueness of the weak solution to the heat flow for the Yang-Mills-Higgs field in a vector bundle over a compact Riemannian 4-manifold, and show that the weak solution is gauge-equivalent to a smooth solution and that there are at most finitely many singularities at the maximal time of existence.
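For orientation — stated as an assumption, since the abstract fixes no notation — one standard form of the Yang-Mills-Higgs energy, whose $L^2$ gradient flow is the heat flow studied here (potential terms, which vary by convention, are omitted):

```latex
% Yang-Mills-Higgs energy of a connection A with curvature F_A and a
% Higgs field \phi on a vector bundle over a compact Riemannian
% 4-manifold (M, g); one common normalization, potential term omitted.
\[
  \mathcal{E}(A, \phi) \;=\; \int_M \Bigl( |F_A|^2 + |\nabla_A \phi|^2 \Bigr) \, dV_g .
\]
```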
Abstract:
We demonstrate complete characterization of a two-qubit entangling process-a linear optics controlled-NOT gate operating with coincident detection-by quantum process tomography. We use a maximum-likelihood estimation to convert the experimental data into a physical process matrix. The process matrix allows an accurate prediction of the operation of the gate for arbitrary input states and a calculation of gate performance measures such as the average gate fidelity, average purity, and entangling capability of our gate, which are 0.90, 0.83, and 0.73, respectively.
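To illustrate how such gate measures follow from a reconstructed process, here is a sketch that computes the average gate fidelity from a trace-one Choi matrix via the standard relation $F_{\text{avg}} = (d\,F_{\text{pro}} + 1)/(d + 1)$. The paper works with the $\chi$-matrix representation; `choi_meas` is assumed to come from the maximum-likelihood reconstruction.

```python
import numpy as np

def avg_gate_fidelity(choi_meas, U):
    """Average gate fidelity of a measured channel against an ideal unitary U.

    `choi_meas` is the trace-one Choi state of the reconstructed process
    (assumed obtained elsewhere, e.g. from ML process tomography).
    """
    d = U.shape[0]
    phi = np.eye(d).reshape(d * d) / np.sqrt(d)    # maximally entangled |Phi>
    psi = np.kron(np.eye(d), U) @ phi              # Choi pure state of U
    f_pro = np.real(psi.conj() @ choi_meas @ psi)  # process fidelity
    return (d * f_pro + 1) / (d + 1)

# Sanity check: a perfect CNOT should give fidelity 1.
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                 [0, 0, 0, 1], [0, 0, 1, 0]], dtype=complex)
phi = np.eye(4).reshape(16) / 2.0
psi = np.kron(np.eye(4), CNOT) @ phi
print(avg_gate_fidelity(np.outer(psi, psi.conj()), CNOT))  # -> 1.0
```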
Abstract:
A bounded continuous function $u : [0, \infty) \to X$ is said to be S-asymptotically $\omega$-periodic if $\lim_{t \to \infty}[u(t + \omega) - u(t)] = 0$. This paper is devoted to the study of the existence and qualitative properties of S-asymptotically $\omega$-periodic mild solutions for some classes of abstract neutral functional differential equations with infinite delay. Furthermore, applications to partial differential equations are given.
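A concrete instance of the definition may help; this example is illustrative and not from the paper:

```latex
% u(t) = \cos t + \arctan t is bounded and continuous on [0, \infty), and
\[
  u(t + 2\pi) - u(t) \;=\; \arctan(t + 2\pi) - \arctan(t) \;\to\; 0
  \quad \text{as } t \to \infty,
\]
% so u is S-asymptotically 2\pi-periodic.
```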
Abstract:
In this paper we consider the problem of providing standard errors for the component means in normal mixture models fitted to univariate or multivariate data by maximum likelihood via the EM algorithm. Two methods of estimating the standard errors are considered: the standard information-based method and the computationally intensive bootstrap method. They are compared empirically by their application to three real data sets and by a small-scale Monte Carlo experiment.
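A minimal sketch of the bootstrap side of that comparison, using a parametric-resampling variant with sklearn's `GaussianMixture` standing in for the EM fit (the paper's setting is more general):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def bootstrap_se_means(x, k=2, n_boot=200, seed=0):
    """Parametric-bootstrap standard errors for component means of a
    univariate normal mixture fitted by EM (a sketch, not the paper's code)."""
    x = np.asarray(x).reshape(-1, 1)
    fit = GaussianMixture(n_components=k, random_state=seed).fit(x)
    boot_means = []
    for b in range(n_boot):
        xb, _ = fit.sample(len(x))  # resample from the fitted mixture
        gb = GaussianMixture(n_components=k, means_init=fit.means_,
                             random_state=seed + b).fit(xb)
        # Sort component means to guard against label switching.
        boot_means.append(np.sort(gb.means_.ravel()))
    est = np.sort(fit.means_.ravel())
    return est, np.std(boot_means, axis=0, ddof=1)
```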
Abstract:
The small sample performance of Granger causality tests under different model dimensions, degree of cointegration, direction of causality, and system stability are presented. Two tests based on maximum likelihood estimation of error-correction models (LR and WALD) are compared to a Wald test based on multivariate least squares estimation of a modified VAR (MWALD). In large samples all test statistics perform well in terms of size and power. For smaller samples, the LR and WALD tests perform better than the MWALD test. Overall, the LR test outperforms the other two in terms of size and power in small samples.
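To fix ideas, a small simulated example of a Wald-type Granger causality test in an ordinary VAR, using statsmodels (the MWALD procedure in the abstract tests causality in a deliberately over-lagged VAR, which this sketch does not replicate):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Simulate a bivariate system in which x Granger-causes y but not vice versa.
rng = np.random.default_rng(0)
n = 200
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.normal()
    y[t] = 0.3 * y[t - 1] + 0.4 * x[t - 1] + rng.normal()

res = VAR(pd.DataFrame({"x": x, "y": y})).fit(maxlags=2)
# Wald test of H0: x does not Granger-cause y.
print(res.test_causality("y", ["x"], kind="wald").summary())
```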
Abstract:
It is shown that coherent quantum simultons (simultaneous solitary waves at two different frequencies) can undergo quadrature-phase squeezing as they propagate through a dispersive $\chi^{(2)}$ waveguide. This requires a treatment of the coupled quantized fields including a quantized depleted pump field. A technique involving nonlinear stochastic parabolic partial differential equations using a nondiagonal coherent state representation in combination with an exact Wigner representation on a reduced phase space is outlined. We explicitly demonstrate that group-velocity matched $\chi^{(2)}$ waveguides which exhibit collinear propagation can produce quadrature-phase squeezed simultons. Quasi-phase-matched KTP waveguides, even with their large group-velocity mismatch between fundamental and second harmonic at 425 nm, can produce 3 dB squeezed bright pulses at 850 nm in the large phase-mismatch regime. This can be improved to more than 6 dB by using group-velocity matched waveguides.
Abstract:
Objective: The aim of this study was to test the effectiveness of various attitude-behavior theories in explaining alcohol use among young adults. The theory of reasoned action (TRA), the theory of planned behavior and an extension of the TRA that incorporates past behavior were compared by the method of maximum-likelihood estimation, as implemented in LISREL for Windows 8.12. Method: Respondents consisted of 122 university students (82 female) who were questioned about their attitudes, subjective norms, perceived behavioral control, past behavior and intentions relating to drinking behavior. Students received course credit for their participation in the research. Results: Overall, the results suggest that the extension of the theory of reasoned action which incorporates past behavior provides the best fit to the data. For these young adults, their intentions to drink alcohol were predicted by their past behavior as well as their perceptions of what important others think they should do (subjective norm). Conclusions: The main conclusions drawn from the research concern the importance of focusing on normative influences and past behavior in explaining young adult alcohol use. Issues regarding the relative merit of various alternative models and the need for greater clarity in the measure of attitudes are also discussed.
Abstract:
Analysis of a major multi-site epidemiologic study of heart disease has required estimation of the pairwise correlation of several measurements across sub-populations. Because the measurements from each sub-population were subject to sampling variability, the Pearson product moment estimator of these correlations produces biased estimates. This paper proposes a model that takes into account within- and between-sub-population variation, provides algorithms for obtaining maximum likelihood estimates of these correlations, and discusses several approaches for obtaining interval estimates.
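The bias being addressed can be seen in the classical attenuation identity, given here as background rather than as the paper's exact model: when each measurement carries independent sampling error, the Pearson correlation of the observed values understates that of the true values.

```latex
% Attenuation: X* = X + e and Y* = Y + f with independent errors e, f.
\[
  \operatorname{corr}(X^{*}, Y^{*})
    = \operatorname{corr}(X, Y)\,\sqrt{\lambda_X \lambda_Y},
  \qquad
  \lambda_X = \frac{\operatorname{var}(X)}{\operatorname{var}(X) + \operatorname{var}(e)},
\]
% with \lambda_Y defined analogously; both reliability ratios are at most 1.
```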
Abstract:
The probit model is a popular device for explaining binary choice decisions in econometrics. It has been used to describe choices such as labor force participation, travel mode, home ownership, and type of education. These and many more examples can be found in papers by Amemiya (1981) and Maddala (1983). Given the contribution of economics towards explaining such choices, and given the nature of the data that are collected, prior information on the relationship between a choice probability and several explanatory variables frequently exists. Bayesian inference is a convenient vehicle for including such prior information. Given the increasing popularity of Bayesian inference, it is useful to ask whether inferences from a probit model are sensitive to the choice between Bayesian and sampling-theory techniques. Of interest is the sensitivity of inference on coefficients, probabilities, and elasticities. We consider these issues in a model designed to explain the choice between fixed and variable interest rate mortgages. Two Bayesian priors are employed: a uniform prior on the coefficients, designed to be noninformative, and an inequality-restricted prior on the signs of the coefficients. We often know, a priori, whether increasing the value of a particular explanatory variable will have a positive or negative effect on a choice probability. This knowledge can be captured by using a prior probability density function (pdf) that is truncated to be positive or negative. Thus, three sets of results are compared: those from maximum likelihood (ML) estimation, those from Bayesian estimation with an unrestricted uniform prior on the coefficients, and those from Bayesian estimation with a uniform prior truncated to accommodate inequality restrictions on the coefficients.
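A toy sketch of the estimation approaches being compared — probit ML plus a random-walk Metropolis sampler under a flat prior truncated by a sign restriction. The data, dimensions, and restricted coefficient are all illustrative, not the mortgage-choice application.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([-0.5, 1.0])
y = (X @ beta_true + rng.normal(size=n) > 0).astype(float)

def loglik(b):
    p = np.clip(norm.cdf(X @ b), 1e-12, 1 - 1e-12)
    return np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

# Maximum likelihood estimate.
b_ml = minimize(lambda b: -loglik(b), np.zeros(2)).x

# Metropolis under a flat prior truncated to beta[1] > 0
# (the prior knowledge that the slope's effect is positive).
draws, b = [], b_ml.copy()
for _ in range(5000):
    prop = b + 0.1 * rng.normal(size=2)
    if prop[1] > 0 and np.log(rng.uniform()) < loglik(prop) - loglik(b):
        b = prop
    draws.append(b.copy())
posterior_mean = np.mean(draws, axis=0)
```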
Abstract:
The catalytic properties of enzymes are usually evaluated by measuring and analyzing reaction rates. However, analyzing the complete time course can be advantageous because it contains additional information about the properties of the enzyme. Moreover, for systems that are not at steady state, the analysis of time courses is the preferred method. One of the major barriers to the wide application of time courses is that it may be computationally more difficult to extract information from these experiments. Here the basic approach to analyzing time courses is described, together with some examples of the essential computer code to implement these analyses. A general method that can be applied to both steady-state and non-steady-state systems is recommended.
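In that spirit, a self-contained sketch of a progress-curve (time-course) fit: simulate product formation under the Michaelis-Menten rate law, then recover $V_{\max}$ and $K_m$ by nonlinear least squares. The data are synthetic and the code is illustrative, not the paper's.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def product_curve(t, s0, vmax, km):
    """Product formed over time under Michaelis-Menten kinetics."""
    rate = lambda _, s: [-vmax * s[0] / (km + s[0])]
    sol = solve_ivp(rate, (0.0, t[-1]), [s0], t_eval=t)
    return s0 - sol.y[0]  # product = substrate consumed

# Synthetic noisy time course (arbitrary units).
t = np.linspace(0.0, 60.0, 30)
s0 = 100.0
rng = np.random.default_rng(2)
p_obs = product_curve(t, s0, vmax=5.0, km=20.0) + rng.normal(0.0, 0.5, t.size)

# Fit Vmax and Km to the full time course, not just initial rates.
fit = least_squares(lambda th: product_curve(t, s0, th[0], th[1]) - p_obs,
                    x0=[1.0, 10.0], bounds=([0.0, 0.0], [np.inf, np.inf]))
print("Vmax, Km:", fit.x)
```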