906 results for Synchronous hidden Markov models


Relevance: 30.00%

Abstract:

In this dissertation, we propose a continuous-time Markov chain model to examine longitudinal data whose outcome variable has three categories. The advantage of this model is that it permits a different number of measurements for each subject, and that the duration between two consecutive measurement time points can be irregular. Using the maximum likelihood principle, we can estimate the transition probability between two time points. By using the information provided by the independent variables, the model can also estimate the transition probability for each subject. The Monte Carlo simulation method will be used to compare the goodness of fit of this model with that of other models. A public health example will be used to demonstrate the application of this method.
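The abstract gives no formulas, but the mechanics it describes can be sketched briefly: for a continuous-time Markov chain with generator (intensity) matrix Q, the transition probabilities over an interval of any length t are P(t) = exp(Qt), which is exactly what makes irregular gaps between measurements unproblematic. The 3-state generator below is hypothetical, chosen only for illustration.

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical generator matrix Q for a 3-category outcome:
# rows sum to zero; off-diagonal entries are transition rates.
Q = np.array([[-0.30,  0.20,  0.10],
              [ 0.15, -0.40,  0.25],
              [ 0.05,  0.10, -0.15]])

def transition_matrix(Q, t):
    """Transition probabilities over an interval of length t: P(t) = expm(Q t)."""
    return expm(Q * t)

# Irregular measurement gaps are handled by evaluating P at each gap length.
P_short = transition_matrix(Q, 0.5)
P_long = transition_matrix(Q, 2.0)
```

Maximum likelihood estimation would then multiply, across subjects, the entries of P evaluated at each subject's observed gap lengths.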

Relevance: 30.00%

Abstract:

We present new high-resolution N isotope records from the Gulf of Tehuantepec and the Nicaragua Basin spanning the last 50-70 ka. The Tehuantepec site is situated within the core of the north subtropical denitrification zone while the Nicaragua site is at the southern boundary. The δ15N record from Nicaragua shows an 'Antarctic' timing similar to denitrification changes observed off Peru-Chile but is radically different from the northern records. We attribute this to the leakage of isotopically heavy nitrate from the South Pacific oxygen minimum zone (OMZ) into the Nicaragua Basin. The Nicaragua record leads the other eastern tropical North Pacific (ETNP) records by about 1000 years because denitrification peaks in the eastern tropical South Pacific (ETSP) before denitrification starts to increase in the Northern Hemisphere OMZ, i.e., during warming episodes in Antarctica. We find that the influence of the heavy nitrate leakage from the ETSP is still noticeable, although attenuated, in the Gulf of Tehuantepec record, particularly at the end of the Heinrich events, and tends to alter the recording of millennial timescale denitrification changes in the ETNP. This implies (1) that sedimentary δ15N records from the southern parts of the ETNP cannot be used straightforwardly as a proxy for local denitrification and (2) that denitrification history in the ETNP, like in the Arabian Sea, is synchronous with Greenland temperature changes. These observations reinforce the conclusion that on millennial timescales during the last ice age, denitrification in the ETNP is strongly influenced by climatic variations that originated in the high-latitude North Atlantic region, while commensurate changes in Southern Ocean hydrography more directly, and slightly earlier, affected oxygen concentrations in the ETSP. Furthermore, the δ15N records imply ongoing physical communication across the equator in the shallow subsurface continuously over the last 50-70 ka.

Relevance: 30.00%

Abstract:

Multi-dimensional Bayesian network classifiers (MBCs) are probabilistic graphical models recently proposed to deal with multi-dimensional classification problems, where each instance in the data set has to be assigned to more than one class variable. In this paper, we propose a Markov blanket-based approach for learning MBCs from data. Basically, it consists of determining the Markov blanket around each class variable using the HITON algorithm, then specifying the directionality over the MBC subgraphs. Our approach is applied to the problem of predicting the European Quality of Life-5 Dimensions (EQ-5D) from the 39-item Parkinson’s Disease Questionnaire (PDQ-39) in order to estimate the health-related quality of life of Parkinson’s patients. Fivefold cross-validation experiments were carried out on randomly generated synthetic data sets, on the Yeast data set, and on a real-world Parkinson’s disease data set containing 488 patients. The experimental study, including comparison with additional Bayesian network-based approaches, back-propagation for multi-label learning, multi-label k-nearest neighbor, multinomial logistic regression, ordinary least squares, and censored least absolute deviations, shows encouraging results in terms of predictive accuracy as well as the identification of dependence relationships among class and feature variables.

Relevance: 30.00%

Abstract:

This paper presents some ideas about a new neural network architecture that can be compared to a Taylor analysis when dealing with patterns. The architecture is based on linear activation functions with axo-axonic connectivity. A biological axo-axonic connection is one in which the weight of a connection between two neurons is given by the output of a third neuron. This idea can be implemented in the so-called Enhanced Neural Networks, in which two Multilayer Perceptrons are used; the first one outputs the weights that the second MLP uses to compute the desired output. This kind of neural network has universal approximation properties even with linear activation functions. There is a clear difference between cooperative and competitive strategies. The former are based on swarm colonies, in which all individuals share their knowledge about the goal, passing this information to other individuals in order to reach an optimal solution. The latter are based on genetic models, in which individuals can die and new individuals are created by combining the information of living ones, or on molecular/cellular behaviour, passing information from one structure to another. A swarm-based model is applied to obtain the neural network, training the net with a Particle Swarm algorithm.
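A minimal numpy sketch of the axo-axonic idea described above: a first, weight-generating network maps the input to the parameters used by a second, linear network. Layer sizes are illustrative and the generator here is a single linear map rather than a full MLP; the paper's actual architecture may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: 4 inputs, 2 outputs.
n_in, n_out = 4, 2

# Weight-generating network: its output is reshaped into the second
# network's parameters (n_out * n_in weights plus n_out biases).
W_gen = rng.normal(scale=0.1, size=(n_in, n_out * n_in + n_out))

def enhanced_net(x):
    params = x @ W_gen                      # first net produces the parameters
    W2 = params[: n_out * n_in].reshape(n_out, n_in)
    b2 = params[n_out * n_in :]
    return W2 @ x + b2                      # second (linear) net computes output

y = enhanced_net(np.array([1.0, -0.5, 0.3, 2.0]))
```

Because the second network's weights depend on the input, the overall map is polynomial in x even though every activation is linear, which is what makes the Taylor-analysis comparison and the universal approximation claim plausible.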

Relevance: 30.00%

Abstract:

The so-called parallel multisplitting nonstationary iterative Model A was introduced by Bru, Elsner, and Neumann [Linear Algebra and its Applications 103:175-192 (1988)] for solving a nonsingular linear system Ax = b using a weak nonnegative multisplitting of the first type. In this paper new results are introduced when A is a monotone matrix using a weak nonnegative multisplitting of the second type, and when A is a symmetric positive definite matrix using a P-regular multisplitting. Nonstationary alternating iterative methods are also studied. Finally, combining Model A and alternating iterative methods, two new models of parallel multisplitting nonstationary iterations are introduced. When matrix A is monotone and the multisplittings are weak nonnegative of the first or of the second type, both models lead to convergent schemes. Likewise, when matrix A is symmetric positive definite and the multisplittings are P-regular, the schemes are convergent.
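As an illustration of the underlying scheme (a stationary multisplitting, not the nonstationary Model A itself): given splittings A = M_k - N_k and diagonal weighting matrices E_k summing to the identity, one iterates x_{i+1} = sum_k E_k M_k^{-1}(N_k x_i + b). The small monotone matrix and the two splittings below are illustrative.

```python
import numpy as np

# Illustrative monotone (M-)matrix and right-hand side; exact solution is [1, 1, 1].
A = np.array([[ 4.0, -1.0,  0.0],
              [-1.0,  4.0, -1.0],
              [ 0.0, -1.0,  4.0]])
b = np.array([3.0, 2.0, 3.0])

# Two weak nonnegative splittings A = M_k - N_k:
# M1 = diagonal part (Jacobi-like), M2 = lower triangle (Gauss-Seidel-like).
M1 = np.diag(np.diag(A))
M2 = np.tril(A)
N1, N2 = M1 - A, M2 - A

# Diagonal weights summing to the identity.
E1 = 0.5 * np.eye(3)
E2 = np.eye(3) - E1

x = np.zeros(3)
for _ in range(200):
    # Each term could be computed on a separate processor in parallel.
    x = E1 @ np.linalg.solve(M1, N1 @ x + b) + E2 @ np.linalg.solve(M2, N2 @ x + b)
```

Since M_k^{-1}(N_k x* + b) = x* for the exact solution x* and the E_k sum to the identity, x* is a fixed point of the iteration; convergence for monotone A with weak nonnegative splittings is what the paper's theorems establish in much greater generality.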

Relevance: 30.00%

Abstract:

The authors discuss the effects that economic crises generate on the global market shares of tourism destinations through a series of potential transmission mechanisms based on the main economic competitiveness determinants identified in the previous literature, using a non-linear approach. Specifically, a Markov switching regression approach is used to estimate the effect of two basic transmission mechanisms: reductions in internal and external tourism demand, and falling investment.
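A minimal sketch of the filtering machinery behind Markov switching models: a two-regime switching-mean series is simulated, and a Hamilton filter recovers the regime probabilities. All parameters here are illustrative and treated as known; the paper's actual specification (a switching regression with covariates, estimated from data) is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two-regime switching-mean model: y_t = mu[S_t] + eps, eps ~ N(0, sigma^2).
P = np.array([[0.95, 0.05],       # transition matrix; rows = current regime
              [0.10, 0.90]])
mu = np.array([2.0, -1.0])         # illustrative "normal" vs "crisis" means
sigma = 0.5
T = 300

# Simulate the hidden regime path and the observed series.
s = np.zeros(T, dtype=int)
for t in range(1, T):
    s[t] = rng.choice(2, p=P[s[t - 1]])
y = mu[s] + sigma * rng.normal(size=T)

def hamilton_filter(y, P, mu, sigma):
    """Filtered regime probabilities Pr(S_t = k | y_1..y_t)."""
    probs = np.empty((len(y), 2))
    xi = np.array([0.5, 0.5])                  # initial regime distribution
    for t, yt in enumerate(y):
        pred = xi @ P                           # one-step-ahead regime probs
        lik = np.exp(-0.5 * ((yt - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
        xi = pred * lik
        xi /= xi.sum()
        probs[t] = xi
    return probs

probs = hamilton_filter(y, P, mu, sigma)
accuracy = (probs.argmax(axis=1) == s).mean()
```

In estimation, the same filter also yields the likelihood, which is maximized over the transition probabilities and regime-specific coefficients.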

Relevance: 30.00%

Abstract:

The observation of several neutron stars in the centre of supernova remnants and with significantly lower values of the dipolar magnetic field than the average radio-pulsar population has motivated a lively debate about their formation and origin, with controversial interpretations. A possible explanation requires the slow rotation of the protoneutron star at birth, which is unable to amplify its magnetic field to typical pulsar levels. An alternative possibility, the hidden magnetic field scenario, considers the accretion of the fallback of the supernova debris onto the neutron star as responsible for the submergence (or screening) of the field and its apparently low value. In this paper, we study under which conditions the magnetic field of a neutron star can be buried into the crust due to an accreting, conducting fluid. For this purpose, we consider a spherically symmetric calculation in general relativity to estimate the balance between the incoming accretion flow and the magnetosphere. Our study analyses several models with different specific entropy, composition, and neutron star masses. The main conclusion of our work is that typical magnetic fields of a few times 10^12 G can be buried by accreting only 10^-3 to 10^-2 M⊙, a relatively modest amount of mass. In view of this result, the central compact object scenario should not be considered unusual, and we predict that anomalously weak magnetic fields should be common in very young (< a few kyr) neutron stars.

Relevance: 30.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance: 30.00%

Abstract:

New high-precision niobium (Nb) and tantalum (Ta) concentration data are presented for early Archaean metabasalts, metabasaltic komatiites and their erosion products (mafic metapelites) from SW Greenland and the Acasta gneiss complex, Canada. Individual datasets consistently show sub-chondritic Nb/Ta ratios averaging 15.1 ± 1.6. This finding is discussed with regard to two competing models for the solution of the Nb-deficit that characterises the accessible Earth. Firstly, we test whether Nb could have sequestered into the core due to its slightly siderophile (or chalcophile) character under very reducing conditions, as recently proposed from experimental evidence. We demonstrate that troilite inclusions of the Canyon Diablo iron meteorite have Nb and V concentrations in excess of typical chondrites but that the metal phase of the Grant, Toluca and Canyon Diablo iron meteorites does not have significant concentrations of these lithophile elements. We find that if the entire accessible Earth Nb-deficit were explained by Nb in the core, only ca. 17% of the mantle could be depleted and that by 3.7 Ga, continental crust would have already achieved ca. 50% of its present mass. Nb/Ta systematics of late Archaean metabasalts compiled from the literature would further require that by 2.5 Ga, 90% of the present mass of continental crust was already in existence. As an alternative to this explanation, we propose that the average Nb/Ta ratio (15.1 ± 1.6) of Earth's oldest mafic rocks is a valid approximation for bulk silicate Earth. This would require that ca. 13% of the terrestrial Nb resided in the Ta-free core. Since the partitioning of Nb between silicate and metal melts depends largely on oxygen fugacity and pressure, this finding could mean that metal/silicate segregation did not occur at the base of a deep magma ocean or that the early mantle was slightly less reducing than generally assumed.
A bulk silicate Earth Nb/Ta ratio of 15.1 allows for depletion of up to 40% of the total mantle. This could indicate that in addition to the upper mantle, a portion of the lower mantle is also depleted, or, if only the upper mantle were depleted, that an additional hidden high Nb/Ta reservoir must exist. Comparison of Nb/Ta systematics between early and late Archaean metabasalts supports the latter idea and indicates that deeply subducted high Nb/Ta eclogite slabs could reside in the mantle transition zone or the lower mantle. Accumulation of such slabs appears to have commenced between 2.5 and 2.0 Ga. Regardless of these complexities of terrestrial Nb/Ta systematics, it is shown that the depleted mantle Nb/Th ratio is a very robust proxy for the amount of extracted continental crust, because the temporal evolution of this ratio is dominated by Th-loss to the continents and not Nb-retention in the mantle. We present a new parameterisation of the continental crust volume versus age curve that specifically explores the possibility of lithophile element loss to the core and storage of eclogite slabs in the transition zone.

Relevance: 30.00%

Abstract:

A recent development of the Markov chain Monte Carlo (MCMC) technique is the emergence of MCMC samplers that allow transitions between different models. Such samplers make possible a range of computational tasks involving models, including model selection, model evaluation, model averaging and hypothesis testing. An example of this type of sampler is the reversible jump MCMC sampler, which is a generalization of the Metropolis-Hastings algorithm. Here, we present a new MCMC sampler of this type. The new sampler is a generalization of the Gibbs sampler, but somewhat surprisingly, it also turns out to encompass as particular cases all of the well-known MCMC samplers, including those of Metropolis, Barker, and Hastings. Moreover, the new sampler generalizes the reversible jump MCMC. It therefore appears to be a very general framework for MCMC sampling. This paper describes the new sampler and illustrates its use in three applications in Computational Biology, specifically determination of consensus sequences, phylogenetic inference and delineation of isochores via multiple change-point analysis.
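The new sampler itself is not specified in the abstract; for reference, here is a minimal random-walk Metropolis-Hastings sampler, the classical algorithm the abstract says is generalized. The target density (a standard normal) and step size are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

def log_target(x):
    """Log density of N(0, 1), up to an additive constant."""
    return -0.5 * x * x

def metropolis_hastings(n_samples, step=1.0, x0=0.0):
    """Random-walk Metropolis-Hastings with a symmetric Gaussian proposal."""
    x = x0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        prop = x + step * rng.normal()                     # propose a move
        # Accept with probability min(1, target(prop)/target(x)).
        if np.log(rng.uniform()) < log_target(prop) - log_target(x):
            x = prop
        samples[i] = x
    return samples

draws = metropolis_hastings(20000)
```

The generalizations the abstract describes (Gibbs, Barker, reversible jump) change the proposal and acceptance mechanism, and in the trans-dimensional case allow moves between models of different dimension, while preserving the same accept/reject skeleton.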

Relevance: 30.00%

Abstract:

We investigate whether relative contributions of genetic and shared environmental factors are associated with an increased risk of melanoma. Data from the Queensland Familial Melanoma Project comprising 15,907 subjects arising from 1912 families were analyzed to estimate the additive genetic, common and unique environmental contributions to variation in the age at onset of melanoma. Two complementary approaches for analyzing correlated time-to-onset family data were considered: the generalized estimating equations (GEE) method, in which one can estimate relationship-specific dependence simultaneously with regression coefficients that describe the average population response to changing covariates; and a subject-specific Bayesian mixed model, in which heterogeneity in regression parameters is explicitly modeled and the different components of variation may be estimated directly. The proportional hazards and Weibull models were utilized, as both produce natural frameworks for estimating relative risks while adjusting for simultaneous effects of other covariates. A simple Markov chain Monte Carlo method for covariate imputation of missing data was used, and the actual implementation of the Bayesian model was based on Gibbs sampling using the freeware package BUGS. In addition, we also used a Bayesian model to investigate the relative contribution of genetic and environmental effects on the expression of naevi and freckles, which are known risk factors for melanoma.

Relevance: 30.00%

Abstract:

Let (Φ_t)_{t ∈ ℝ+} be a Harris ergodic continuous-time Markov process on a general state space, with invariant probability measure π. We investigate the rates of convergence of the transition function P^t(x, ·) to π; specifically, we find conditions under which r(t)||P^t(x, ·) − π|| → 0 as t → ∞, for suitable subgeometric rate functions r(t), where || · || denotes the usual total variation norm for a signed measure. We derive sufficient conditions for the convergence to hold, in terms of the existence of suitable points on which the first hitting time moments are bounded. In particular, for stochastically ordered Markov processes, explicit bounds on subgeometric rates of convergence are obtained. These results are illustrated in several examples.
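To make the quantities concrete, the total variation distance ||P^t(x, ·) − π|| can be computed exactly for a toy two-state continuous-time chain (the generator below is illustrative). Note this toy chain converges geometrically; the paper's interest is in chains where only slower, subgeometric rates r(t) are available.

```python
import numpy as np
from scipy.linalg import expm

# Illustrative two-state generator; stationary distribution pi solves pi Q = 0.
Q = np.array([[-1.0,  1.0],
              [ 2.0, -2.0]])
pi = np.array([2.0, 1.0]) / 3.0

def tv_distance(t, x=0):
    """||P^t(x, .) - pi|| in total variation, via P^t = expm(Q t)."""
    Pt = expm(Q * t)
    return 0.5 * np.abs(Pt[x] - pi).sum()

dists = [tv_distance(t) for t in (0.5, 1.0, 2.0, 4.0)]
```

For this chain the distance decays like exp(−3t) (3 being the spectral gap); subgeometric rates such as polynomial r(t) arise when no such uniform gap exists.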

Relevance: 30.00%

Abstract:

We derive necessary and sufficient conditions for the existence of bounded or summable solutions to systems of linear equations associated with Markov chains. This substantially extends a famous result of G. E. H. Reuter, which provides a convenient means of checking various uniqueness criteria for birth-death processes. Our result allows chains with much more general transition structures to be accommodated. One application is to give a new proof of an important result of M. F. Chen concerning upwardly skip-free processes. We then use our generalization of Reuter's lemma to prove new results for downwardly skip-free chains, such as the Markov branching process and several of its many generalizations. This permits us to establish uniqueness criteria for several models, including the general birth, death, and catastrophe process, extended branching processes, and asymptotic birth-death processes, the latter being neither upwardly skip-free nor downwardly skip-free.

Relevance: 30.00%

Abstract:

This paper has three primary aims: to establish an effective means for modelling mainland-island metapopulations inhabiting a dynamic landscape; to investigate the effect of immigration and dynamic changes in habitat on metapopulation patch occupancy dynamics; and to illustrate the implications of our results for decision-making and population management. We first extend the mainland-island metapopulation model of Alonso and McKane [Bull. Math. Biol. 64:913-958, 2002] to incorporate a dynamic landscape. It is shown, for both the static and the dynamic landscape models, that a suitably scaled version of the process converges to a unique deterministic model as the size of the system becomes large. We also establish that, under quite general conditions, the density of occupied patches, and the densities of suitable and occupied patches, for the respective models, have approximately normal distributions. Our results not only provide estimates for the means and variances that are valid at all stages in the evolution of the population, but also provide a tool for fitting the models to real metapopulations. We discuss the effect of immigration and habitat dynamics on metapopulations, showing that mainland-like patches heavily influence metapopulation persistence, and we argue for adopting measures to increase connectivity between this large patch and the other island-like patches. We illustrate our results with specific reference to example populations of butterflies and of the grasshopper Bryodema tuberculata.
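A sketch of a deterministic limit of the kind described above, using a simplified Levins-type mainland-island equation dp/dt = (c·p + m)(1 − p) − e·p for the density p of occupied patches, where m is mainland immigration, c internal colonization and e extinction. This is a stand-in for, not a reproduction of, the Alonso-McKane model, and all rates are illustrative.

```python
import numpy as np

# Illustrative rates: internal colonization c, mainland immigration m, extinction e.
c, m, e = 0.5, 0.2, 0.3

def simulate(p0=0.0, dt=0.01, t_end=100.0):
    """Forward-Euler integration of dp/dt = (c*p + m)*(1 - p) - e*p."""
    p = p0
    for _ in range(int(t_end / dt)):
        p += dt * ((c * p + m) * (1 - p) - e * p)
    return p

p_star = simulate()   # equilibrium occupied-patch density
```

The mainland term m keeps the equilibrium strictly positive even when internal colonization alone could not sustain the population, which is the deterministic counterpart of the observation that mainland-like patches heavily influence persistence.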