216 results for hierarchical hidden Markov model
in Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)
Abstract:
We propose and analyze two different Bayesian online algorithms for learning in discrete Hidden Markov Models and compare their performance with that of the already known Baldi-Chauvin algorithm. Using the Kullback-Leibler divergence as a measure of generalization, we draw learning curves for these algorithms in simplified situations and compare their performances.
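For concreteness, here is a minimal sketch (not the authors' code) of the Kullback-Leibler divergence between two discrete distributions, the quantity used above as a measure of generalization; the function name and the two-state example are purely illustrative.

import numpy as np

def kl_divergence(p, q, eps=1e-12):
    # Kullback-Leibler divergence D(p || q) between two discrete distributions
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * (np.log(p + eps) - np.log(q + eps))))

# Illustrative example: true versus learned transition row of a two-state HMM
print(kl_divergence([0.9, 0.1], [0.8, 0.2]))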
Abstract:
Schistosoma mansoni is responsible for the neglected tropical disease schistosomiasis that affects 210 million people in 76 countries. Here we present analysis of the 363 megabase nuclear genome of the blood fluke. It encodes at least 11,809 genes, with an unusual intron size distribution, and new families of micro-exon genes that undergo frequent alternative splicing. As the first sequenced flatworm, and a representative of the Lophotrochozoa, it offers insights into early events in the evolution of the animals, including the development of a body pattern with bilateral symmetry, and the development of tissues into organs. Our analysis has been informed by the need to find new drug targets. The deficits in lipid metabolism that make schistosomes dependent on the host are revealed, and the identification of membrane receptors, ion channels and more than 300 proteases provides new insights into the biology of the life cycle and new targets. Bioinformatics approaches have identified metabolic chokepoints, and a chemogenomic screen has pinpointed schistosome proteins for which existing drugs may be active. The information generated provides an invaluable resource for the research community to develop much needed new control tools for the treatment and eradication of this important and neglected disease.
Abstract:
Large-conductance Ca(2+)-activated K(+) channels (BK) play a fundamental role in modulating membrane potential in many cell types. The gating of BK channels and its modulation by Ca(2+) and voltage has been the subject of intensive research over almost three decades, yielding several of the most complicated kinetic mechanisms ever proposed. These mechanisms are characterized by a large number of open and closed states arranged, respectively, in two planes, named tiers. Transitions between states in the same plane are cooperative and modulated by Ca(2+). Transitions across planes are highly concerted and voltage-dependent. Here we reexamine the validity of the two-tiered hypothesis by restricting attention to the modulation by Ca(2+). Large single channel data sets at five Ca(2+) concentrations were simultaneously analyzed from a Bayesian perspective by using hidden Markov models and Markov-chain Monte Carlo stochastic integration techniques. Our results support a dramatic reduction in model complexity, favoring a simple mechanism derived from the Monod-Wyman-Changeux allosteric model for homotetramers, able to explain the Ca(2+) modulation of the gating process. This model differs from the standard Monod-Wyman-Changeux scheme in that it distinguishes whether two Ca(2+) ions are bound to adjacent or to diagonal subunits of the tetramer.
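To make the inference setting concrete, here is a minimal, hypothetical sketch (not the authors' model, which has many more states) of the forward-algorithm likelihood of a noisy single-channel current trace under a two-state open/closed hidden Markov model with Gaussian emissions; this likelihood is the quantity a Markov-chain Monte Carlo sampler would target.

import numpy as np
from scipy.stats import norm

def hmm_log_likelihood(y, A, pi, means, sd):
    # Scaled forward algorithm for an HMM with Gaussian emissions
    alpha = pi * norm.pdf(y[0], means, sd)
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for t in range(1, len(y)):
        alpha = (alpha @ A) * norm.pdf(y[t], means, sd)
        loglik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return loglik

# Hypothetical trace: closed state near 0 pA, open state near 10 pA
A = np.array([[0.95, 0.05], [0.10, 0.90]])   # transition matrix
pi = np.array([0.5, 0.5])                    # initial state distribution
y = np.concatenate([np.random.normal(0, 1, 200), np.random.normal(10, 1, 100)])
print(hmm_log_likelihood(y, A, pi, np.array([0.0, 10.0]), 1.0))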
Abstract:
Online music databases have increased significantly as a consequence of the rapid growth of the Internet and digital audio, requiring the development of faster and more efficient tools for music content analysis. Musical genres are widely used to organize music collections. In this paper, the problem of automatic single and multi-label music genre classification is addressed by exploring rhythm-based features obtained from a corresponding complex network representation. A Markov model is built in order to analyse the temporal sequence of rhythmic notation events. Feature analysis is performed by using two multi-variate statistical approaches: principal components analysis (unsupervised) and linear discriminant analysis (supervised). Similarly, two classifiers are applied in order to identify the category of rhythms: a parametric Bayesian classifier under the Gaussian hypothesis (supervised) and agglomerative hierarchical clustering (unsupervised). Results obtained using the kappa coefficient, together with the resulting clusters, corroborated the effectiveness of the proposed method.
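As an illustration of the Markov-model step described above, the sketch below (with a hypothetical encoding of rhythmic events, not the paper's feature set) estimates a first-order transition matrix from a symbol sequence.

import numpy as np

def transition_matrix(events, n_symbols):
    # Estimate a first-order Markov transition matrix from a symbol sequence
    counts = np.zeros((n_symbols, n_symbols))
    for a, b in zip(events[:-1], events[1:]):
        counts[a, b] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1.0   # avoid division by zero for unseen symbols
    return counts / row_sums

# Hypothetical rhythm sequence encoded as duration classes (0 = quarter, 1 = eighth, 2 = half)
print(transition_matrix([0, 1, 1, 0, 2, 0, 1, 0, 0, 1], n_symbols=3))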
Abstract:
The aim of this study was to investigate the relationships of socioeconomic, environmental and biological factors with hypertension, by gender. The study population consisted of adults living in two municipalities of the Vale do Paraíba (SP), one of the poorest regions of the state of São Paulo, and comprised 274 (39.8%) men and 415 (60.2%) women. The analysis was carried out with a hierarchical logistic regression model, fitted separately for men and women. Adjusted odds ratios (ORaj) were estimated, with 95% confidence intervals and α = 0.05. For men, the following risk factors were associated with hypertension: living in a rural area (ORaj = 2.00; p = 0.01), alcohol consumption (ORaj = 1.90; p = 0.03) and age above 40 years (ORaj = 3.10; p < 0.0001). Large families, with more than six members, had a protective effect (ORaj = 0.46; p = 0.02). For women, the associated risk factors were: lack of schooling (ORaj = 2.37; p = 0.0003), physical inactivity (ORaj = 1.71; p = 0.04), obesity combined with short stature (ORaj = 4.66; p < 0.0001) and age above 40 years (ORaj = 5.29; p = 0.01). Obesity alone was not associated with hypertension at blood pressure levels equal to or higher than those corresponding to stage II of the reference standard.
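For readers less familiar with this kind of reporting, the snippet below shows, with hypothetical numbers rather than the study's data, how an adjusted odds ratio and its 95% confidence interval are obtained from a logistic regression coefficient and its standard error.

import math

def odds_ratio_ci(beta, se, z=1.96):
    # Adjusted OR and 95% CI from a logistic regression coefficient and its standard error
    return math.exp(beta), (math.exp(beta - z * se), math.exp(beta + z * se))

# Hypothetical coefficient for 'rural residence' in the men's model
or_hat, (lo, hi) = odds_ratio_ci(beta=0.693, se=0.27)
print(f"adjusted OR = {or_hat:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")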
Abstract:
In this paper, we present an analog of a Bell inequality violation test for N qubits to be performed in a nuclear magnetic resonance (NMR) quantum computer. This can be used to simulate or predict the results of different Bell inequality tests, with distinct configurations and a larger number of qubits. To demonstrate our scheme, we implemented a simulation of the violation of the Clauser, Horne, Shimony and Holt (CHSH) inequality using a two-qubit NMR system and compared the results to those of a photon experiment. The experimental results are well described both by quantum mechanics and by a local realistic hidden variables model (LRHVM) that was specifically developed for NMR. That is why we refer to this experiment as a simulation of Bell's inequality violation. Our result shows explicitly how the two theories can be compatible with each other due to the detection loophole. In the last part of this work, we discuss the possibility of testing some fundamental features of quantum mechanics using NMR with highly polarized spins, where a strong discrepancy between quantum mechanics and hidden variables models can be expected.
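For context (a standard statement, not taken from the abstract), the CHSH combination of correlators for measurement settings a, a' and b, b' is

S = E(a,b) - E(a,b') + E(a',b) + E(a',b'), \qquad |S|_{\mathrm{LHV}} \le 2, \qquad |S|_{\mathrm{QM}} \le 2\sqrt{2},

where the first bound holds for any local hidden-variables model and the second (the Tsirelson bound) for quantum mechanics; an observed |S| > 2 is what a genuine violation experiment would certify and what the NMR scheme above simulates.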
Abstract:
When building genetic maps, it is necessary to choose from several marker ordering algorithms and criteria, and the choice is not always simple. In this study, we evaluate the efficiency of the algorithms try (TRY), seriation (SER), rapid chain delineation (RCD), recombination counting and ordering (RECORD) and unidirectional growth (UG), as well as the criteria PARF (product of adjacent recombination fractions), SARF (sum of adjacent recombination fractions), SALOD (sum of adjacent LOD scores) and LHMC (likelihood through hidden Markov chains), used with the RIPPLE algorithm for error verification, in the construction of genetic linkage maps. A linkage map of a hypothetical diploid and monoecious plant species was simulated, containing one linkage group and 21 markers with a fixed distance of 3 cM between them. In all, 700 F(2) populations were randomly simulated with 100 and 400 individuals and with different combinations of dominant and co-dominant markers, as well as 10 and 20% of missing data. The simulations showed that, in the presence of co-dominant markers only, any combination of algorithm and criteria may be used, even for a reduced population size. In the case of a smaller proportion of dominant markers, any of the algorithms and criteria investigated (except SALOD) may be used. In the presence of high proportions of dominant markers and smaller samples (around 100), the probability of linkage in repulsion between them increases and, in this case, use of the algorithms TRY and SER associated with RIPPLE and the LHMC criterion would provide better results. Heredity (2009) 103, 494-502; doi:10.1038/hdy.2009.96; published online 29 July 2009
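As a concrete illustration of one of the criteria listed above, the sketch below (using a small hypothetical recombination-fraction matrix) computes SARF, the sum of adjacent recombination fractions, for a candidate marker order; orders with smaller SARF are preferred.

import numpy as np

def sarf(order, rf):
    # Sum of adjacent recombination fractions for a candidate marker order
    return sum(rf[a, b] for a, b in zip(order[:-1], order[1:]))

# Hypothetical pairwise recombination-fraction matrix for four markers
rf = np.array([[0.00, 0.05, 0.12, 0.20],
               [0.05, 0.00, 0.06, 0.15],
               [0.12, 0.06, 0.00, 0.08],
               [0.20, 0.15, 0.08, 0.00]])
print(sarf([0, 1, 2, 3], rf))   # 0.05 + 0.06 + 0.08 = 0.19
print(sarf([0, 2, 1, 3], rf))   # a shuffled order yields a larger value (0.33)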
Abstract:
Introduction: Different modalities of palliation for obstructive symptoms in patients with unresectable esophageal cancer (EC) exist. However, these therapeutic alternatives have significant differences in costs and effectiveness. Methods: A Markov model was designed to compare the cost-effectiveness (CE) of self-expandable stent (SES), brachytherapy and laser in the palliation of unresectable EC. Patients were assigned to one of the strategies, and the improvement in swallowing function was compared given the treatment efficacy, probability of survival, and risks of complications associated with each strategy. Probabilities and distribution parameters were based on a 9-month time frame. Results: Under the base-case scenario, laser has the lowest CE ratio, followed by brachytherapy at an incremental cost-effectiveness ratio (ICER) of $4,400.00, and SES is a dominated strategy. In the probabilistic analysis, laser is the strategy with the highest probability of cost-effectiveness for willingness-to-pay (WTP) values lower than $3,201, and brachytherapy for all WTP values yielding a positive net health benefit (NHB) (threshold $4,440). The highest probability of cost-effectiveness for brachytherapy is 96%; consequently, selection of suboptimal strategies can lead to opportunity losses for the US health system ranging from US$ 4.32 to US$ 38.09 million over the next 5-20 years. Conclusion: Conditional on the WTP and current US Medicare costs, palliation of unresectable esophageal cancer with brachytherapy provides the largest amount of NHB and is the strategy with the highest probability of CE. However, some level of uncertainty remains, and wrong decisions will be made until further knowledge is acquired.
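For clarity, the incremental cost-effectiveness ratio mentioned above is simply the extra cost per extra unit of effectiveness of one strategy over another; the figures in the snippet below are hypothetical, not the study's values.

def icer(cost_new, eff_new, cost_ref, eff_ref):
    # Incremental cost-effectiveness ratio: extra cost per extra unit of effectiveness
    return (cost_new - cost_ref) / (eff_new - eff_ref)

# Hypothetical values: costs in dollars, effectiveness in quality-adjusted life years
print(icer(cost_new=12000.0, eff_new=0.50, cost_ref=9800.0, eff_ref=0.25))  # 8800.0 dollars per QALY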
Abstract:
The Madden-Julian oscillation (MJO) is the most prominent form of tropical intraseasonal variability. This study investigated the following questions. Do inter-annual-to-decadal variations in tropical sea surface temperature (SST) lead to substantial changes in MJO activity? Was there a change in the MJO in the 1970s? Can this change be associated with SST anomalies? What was the level of MJO activity in the pre-reanalysis era? These questions were investigated with a stochastic model of the MJO. Reanalysis data (1948-2008) were used to develop a nine-state first-order Markov model capable of simulating the non-stationarity of the MJO. The model is driven by observed SST anomalies, and a large ensemble of simulations was performed to infer the activity of the MJO in the instrumental period (1880-2008). The model is capable of reproducing the activity of the MJO during the reanalysis period. The simulations indicate that the MJO exhibited a regime of near-normal activity in 1948-1972 (3.4 events year(-1)) and two regimes of high activity in 1973-1989 (3.9 events) and 1990-2008 (4.6 events). Stochastic simulations indicate decadal shifts with near-normal levels in 1880-1895 (3.4 events), low activity in 1896-1917 (2.6 events) and a return to near-normal levels during 1918-1947 (3.3 events). The results also point to significant decadal changes in the probability of very active years (5 or more MJO events): 0.214 (1880-1895), 0.076 (1896-1917), 0.197 (1918-1947) and 0.193 (1948-1972). After the change in behavior in the 1970s, this probability increased to 0.329 (1973-1989) and 0.510 (1990-2008). The observational analyses and stochastic simulations presented here call attention to the need to further understand the variability of the MJO on a wide range of time scales.
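The sketch below illustrates only the generic mechanism of such a model (the transition matrix here is hypothetical, time-invariant and three-state, whereas the study's nine-state matrix is modulated by observed SST anomalies): simulating a first-order Markov chain over discrete activity states.

import numpy as np

def simulate_markov_chain(P, n_steps, start=0, seed=0):
    # Simulate a first-order Markov chain with row-stochastic transition matrix P
    rng = np.random.default_rng(seed)
    states = [start]
    for _ in range(n_steps - 1):
        states.append(rng.choice(len(P), p=P[states[-1]]))
    return states

# Hypothetical three-state transition matrix
P = np.array([[0.80, 0.15, 0.05],
              [0.20, 0.60, 0.20],
              [0.10, 0.30, 0.60]])
print(simulate_markov_chain(P, n_steps=20))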
Abstract:
Robotic mapping is the process of automatically constructing an environment representation using mobile robots. We address the problem of semantic mapping, which consists of using mobile robots to create maps that represent not only metric occupancy but also other properties of the environment. Specifically, we develop techniques to build maps that represent activity and navigability of the environment. Our approach to semantic mapping is to combine machine learning techniques with standard mapping algorithms. Supervised learning methods are used to automatically associate properties of space with the desired classification patterns. We present two methods, the first based on hidden Markov models and the second on support vector machines. Both approaches have been tested and experimentally validated in two problem domains: terrain mapping and activity-based mapping.
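As a minimal sketch of the supervised-learning side of this approach (with synthetic features and labels, not the authors' data or implementation), the snippet below trains a support vector machine with scikit-learn to label map cells as navigable or not from two hypothetical per-cell features.

import numpy as np
from sklearn.svm import SVC

# Synthetic example: each map cell described by two features
# (say, mean height and roughness of the terrain in that cell); label 1 = navigable
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 2))
y_train = (X_train[:, 0] + 0.5 * X_train[:, 1] > 0).astype(int)

clf = SVC(kernel="rbf", C=1.0)
clf.fit(X_train, y_train)

X_new = np.array([[1.2, -0.3], [-0.8, 0.1]])
print(clf.predict(X_new))   # predicted navigability labels for new cells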
Abstract:
A continuous version of the hierarchical spherical model at dimension d=4 is investigated. Two limit distributions of the block spin variable X(gamma), normalized with exponents gamma = d + 2 and gamma = d at and above the critical temperature, are established. These results are proven by solving certain evolution equations corresponding to the renormalization group (RG) transformation of the O(N) hierarchical spin model of block size L(d) in the limit L down arrow 1 and N -> infinity. Starting far away from the stationary Gaussian fixed point, the trajectories of this dynamical system pass through two different regimes with distinguishable crossover behavior. An interpretation of these trajectories is given by the geometric theory of functions, which describes precisely the motion of the Lee-Yang zeroes. The large-N limit of the RG transformation with L(d) fixed equal to 2, at criticality, has recently been investigated in both the weak and strong (coupling) regimes by Watanabe (J. Stat. Phys. 115:1669-1713, 2004). Although our analysis deals only with the N = infinity case, it complements various aspects of that work.
Abstract:
We discuss the estimation of the expected value of quality-adjusted survival based on multistate models. We generalize an earlier work by considering that the sojourn times in the health states, for a given vector of covariates, are not identically distributed. Approaches based on semiparametric and parametric (exponential and Weibull distributions) methodologies are considered. A simulation study is conducted to evaluate the performance of the proposed estimator, and the jackknife resampling method is used to estimate its variance. An application to a real data set is also included.
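As a reminder of the resampling step mentioned above, here is a generic leave-one-out jackknife variance estimate; the statistic below is just the sample mean, standing in for the paper's quality-adjusted survival estimator.

import numpy as np

def jackknife_variance(data, statistic):
    # Leave-one-out jackknife estimate of the variance of a statistic
    n = len(data)
    theta_i = np.array([statistic(np.delete(data, i)) for i in range(n)])
    theta_bar = theta_i.mean()
    return (n - 1) / n * np.sum((theta_i - theta_bar) ** 2)

data = np.array([2.1, 3.4, 2.9, 4.0, 3.3, 2.7])
print(jackknife_variance(data, np.mean))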
Abstract:
The main goal of this paper is to establish some equivalence results on stability, recurrence, and ergodicity between a piecewise deterministic Markov process (PDMP) {X(t)} and an embedded discrete-time Markov chain {Theta(n)} generated by a Markov kernel G that can be explicitly characterized in terms of the three local characteristics of the PDMP, leading to tractable criterion results. First we establish some important results characterizing {Theta(n)} as a sampling of the PDMP {X(t)} and deriving a connection between the probability of the first return time to a set for the discrete-time Markov chains generated by G and the resolvent kernel R of the PDMP. From these results we obtain equivalence results regarding irreducibility, existence of sigma-finite invariant measures, and (positive) recurrence and (positive) Harris recurrence between {X(t)} and {Theta(n)}, generalizing the results of [F. Dufour and O. L. V. Costa, SIAM J. Control Optim., 37 (1999), pp. 1483-1502] in several directions. Sufficient conditions in terms of a modified Foster-Lyapunov criterion are also presented to ensure positive Harris recurrence and ergodicity of the PDMP. We illustrate the use of these conditions by showing the ergodicity of a capacity expansion model.
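For orientation, a Foster-Lyapunov drift criterion for a continuous-time Markov process typically takes the form (this is the standard template, not the paper's precise modified condition)

\mathcal{A} V(x) \le -\beta V(x) + b\,\mathbf{1}_C(x), \qquad x \in E,

where \mathcal{A} is the extended generator of the process, V \ge 1 is a Lyapunov function, \beta > 0 and b < \infty are constants, and C is a suitable small set; drift of this kind toward C is the type of condition that yields positive Harris recurrence and ergodicity.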
Abstract:
We consider a binary Bose-Einstein condensate (BEC) described by a system of two-dimensional (2D) Gross-Pitaevskii equations with the harmonic-oscillator trapping potential. The intraspecies interactions are attractive, while the interaction between the species may have either sign. The same model applies to the copropagation of bimodal beams in photonic-crystal fibers. We consider a family of trapped hidden-vorticity (HV) modes in the form of bound states of two components with opposite vorticities S(1,2) = +/- 1, the total angular momentum being zero. A challenging problem is the stability of the HV modes. By means of a linear-stability analysis and direct simulations, stability domains are identified in a relevant parameter plane. In direct simulations, stable HV modes feature robustness against large perturbations, while unstable ones split into fragments whose number is identical to the azimuthal index of the fastest growing perturbation eigenmode. Conditions allowing for the creation of the HV modes in the experiment are discussed too. For comparison, a similar but simpler problem is studied in an analytical form, viz., the modulational instability of an HV state in a one-dimensional (1D) system with periodic boundary conditions (this system models a counterflow in a binary BEC mixture loaded into a toroidal trap or a bimodal optical beam coupled into a cylindrical shell). We demonstrate that the stabilization of the 1D HV modes is impossible, which stresses the significance of the stabilization of the HV modes in the 2D setting.
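In one common scaled form (the paper's exact normalization may differ), coupled two-dimensional Gross-Pitaevskii equations with a harmonic trap, attractive intraspecies interactions and a cross-interaction coefficient g of either sign read

i\,\partial_t \psi_{1,2} = \Big[ -\tfrac{1}{2}\nabla^2 + \tfrac{1}{2}\Omega^2 r^2 - |\psi_{1,2}|^2 + g\,|\psi_{2,1}|^2 \Big]\psi_{1,2},

and hidden-vorticity modes are sought as stationary states \psi_{1,2} = u(r)\,e^{\pm i\theta - i\mu t}, carrying opposite unit vorticities and zero total angular momentum.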
Abstract:
We study a general stochastic rumour model in which an ignorant individual has a certain probability of becoming a stifler immediately upon hearing the rumour. We refer to this special kind of stifler as an uninterested individual. Our model also includes distinct rates for meetings between two spreaders in which both become stiflers or only one does, so that particular cases are the classical Daley-Kendall and Maki-Thompson models. We prove a Law of Large Numbers and a Central Limit Theorem for the proportions of those who ultimately remain ignorant and those who have heard the rumour but become uninterested in it.
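For intuition about the dynamics described above, the sketch below simulates a simple Maki-Thompson-style process extended with the 'uninterested' mechanism: with probability theta, an ignorant who hears the rumour becomes a stifler immediately. The rates are schematic and the spreader-spreader rule is the classical one, so this is an illustration rather than the paper's general model.

import random

def simulate_rumour(n, theta, seed=0):
    # Maki-Thompson-style rumour dynamics with an 'uninterested' probability theta
    random.seed(seed)
    status = ["ignorant"] * n
    status[0] = "spreader"
    spreaders = {0}
    while spreaders:
        i = random.choice(list(spreaders))      # a spreader initiates a meeting
        j = random.randrange(n)
        if j == i:
            continue
        if status[j] == "ignorant":
            if random.random() < theta:         # hears the rumour but loses interest at once
                status[j] = "uninterested"
            else:
                status[j] = "spreader"
                spreaders.add(j)
        else:                                   # meeting any non-ignorant silences the initiator
            status[i] = "stifler"
            spreaders.discard(i)
    return status.count("ignorant") / n, status.count("uninterested") / n

# final proportions of ultimately ignorant and of uninterested individuals
print(simulate_rumour(n=1000, theta=0.2))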