967 results for Markov chains


Relevance:

60.00%

Publisher:

Abstract:

Worldwide, hepatitis caused by viral infections has been a major public health concern because of its chronic character, asymptomatic course, and capacity to cause loss of liver function. With the large-scale use of antiretroviral drugs, liver disease related to hepatitis C virus (HCV) infection has contributed to a radical change in the natural history of human immunodeficiency virus (HIV) infection. The burden of HCV/HIV coinfection in Brazil is not known precisely, but the evidence indicates that, regardless of geographic region, coinfected individuals have greater difficulty clearing HCV after pharmacological treatment than monoinfected individuals. Within the SUS (the Brazilian Unified Health System), the standard antiviral treatment for carriers of HCV genotype 1 and HIV is peginterferon combined with ribavirin. The two most recent therapeutic protocols diverge on the duration of treatment and on which individuals should be included: the most recent guideline recommends treating early responders together with slow virological responders, whereas the immediately preceding guideline excludes, at week 12, individuals who have not achieved a complete response. Based on this divergence, this study aimed to assess the cost-effectiveness of HCV treatment in individuals with genotype 1, coinfected with HIV, treatment-naive, non-cirrhotic and immunologically stable, under the antiviral treatment rules established by the two most recent therapeutic guidelines for care within the SUS. To this end, a mathematical decision model based on Markov chains was built to simulate the progression of liver disease with and without treatment. A hypothetical cohort of one thousand men over 40 years of age was followed. The perspective of the Unified Health System was adopted, with a 30-year time horizon and a 5% discount rate for costs and clinical outcomes. Extending treatment to slow responders yielded a gain of 0.28 quality-adjusted life years (QALY), a 7% gain in survival, and a 60% increase in the number of individuals who cleared HCV. Beyond the expected efficacy benefits, the inclusion of slow virological responders proved to be a cost-effective strategy, with an incremental cost-effectiveness ratio of R$ 44,171/QALY, below the acceptability threshold proposed by the World Health Organization (WHO) of R$ 63,756/QALY. Sensitivity analysis showed that the uncertainties contained in the model are not capable of changing the final result, demonstrating the robustness of the analysis. From a pharmacoeconomic standpoint, the inclusion of slow virological responder HCV/HIV-coinfected individuals in the treatment protocol is a strategy with a favorable cost-effectiveness ratio for the Unified Health System. Its adoption is fully compatible with the system's perspective, returning better health outcomes at costs below an acceptable budget ceiling, and with society's perspective, by avoiding complications and hospitalizations to a greater degree than non-inclusion.
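
The abstract describes a Markov cohort decision model with a 30-year horizon and 5% annual discounting. The following minimal sketch (a hypothetical three-state disease model with illustrative transition probabilities, costs, and utilities, none taken from the study) shows how such a model accumulates discounted costs and QALYs and computes an incremental cost-effectiveness ratio.

```python
import numpy as np

def run_cohort(P, costs, utilities, years=30, discount=0.05, n=1000):
    """Propagate a cohort through a Markov model, accumulating
    discounted costs and QALYs (illustrative sketch only)."""
    state = np.zeros(P.shape[0])
    state[0] = n                          # everyone starts in the first state
    total_cost, total_qaly = 0.0, 0.0
    for t in range(years):
        d = 1.0 / (1.0 + discount) ** t   # discount factor for cycle t
        total_cost += d * state @ costs
        total_qaly += d * state @ utilities
        state = state @ P                 # one annual transition
    return total_cost, total_qaly

# Hypothetical states: [chronic HCV, cirrhosis, death] -- illustrative numbers only.
P_no_treat = np.array([[0.90, 0.07, 0.03],
                       [0.00, 0.90, 0.10],
                       [0.00, 0.00, 1.00]])
P_treat    = np.array([[0.95, 0.03, 0.02],
                       [0.00, 0.92, 0.08],
                       [0.00, 0.00, 1.00]])
costs     = np.array([1000.0, 8000.0, 0.0])   # annual cost per person, per state
utilities = np.array([0.80, 0.55, 0.0])       # QALY weight per state

c0, q0 = run_cohort(P_no_treat, costs, utilities)
c1, q1 = run_cohort(P_treat, costs + np.array([12000.0, 0.0, 0.0]), utilities)
icer = (c1 - c0) / (q1 - q0)
print(f"ICER: R$ {icer:,.0f} per QALY")
```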

Relevance:

60.00%

Publisher:

Abstract:

Tandem mass spectrometry (MS/MS) is considered worldwide to be the gold standard for newborn screening (NBS) of inborn errors of metabolism (IEM). Besides offering better sensitivity and specificity, it makes it possible to screen for a wide range of IEM with a single test. The Brazilian National Newborn Screening Program (PNTN) currently screens for five diseases (phenylketonuria, congenital hypothyroidism, cystic fibrosis, hemoglobinopathies, and biotinidase deficiency). One of the goals of the PNTN is to improve the program and to incorporate new diseases and/or technologies. With the recent recommendation by CONITEC (the National Commission for the Incorporation of Technologies) to acquire MS/MS for the diagnosis of rare diseases, this technology is expected to expand the number of diseases screened and improve the quality of the diagnostic test, contributing to a better quality of life for children affected by IEM. This study aimed to carry out a cost-effectiveness analysis of incorporating tandem MS/MS into newborn screening, from the perspective of the SUS. To that end, different NBS scenarios were compared: the technology currently used (fluorimetry) for phenylketonuria (PKU) only, and MS/MS for screening both PKU and medium-chain acyl-CoA dehydrogenase deficiency (MCAD). A mathematical decision model based on Markov chains was built to simulate NBS for PKU and MCAD as well as the natural history of MCAD. A hypothetical cohort of one hundred thousand newborns was followed. The time horizon adopted was the life expectancy of the Brazilian population, 78 years according to the IBGE. A 5% discount rate was applied to costs and clinical outcomes in both proposed scenarios. When MS/MS was incorporated for PKU screening only, health gains remained the same, since the performance (effectiveness) of MS/MS and fluorimetry was practically identical, but the incremental cost was four times higher for the same effectiveness, which makes MS/MS for PKU alone not cost-effective (dominated). However, in the scenario of MS/MS screening for both PKU and MCAD, the incremental cost of MS/MS within the PNTN was lower because of the savings achieved, since both tests can be performed on the same dried blood spot sample used in the current newborn screening test.
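
The comparison described here hinges on strategy dominance: a strategy with the same effectiveness but higher cost is dominated, and only non-dominated strategies receive an ICER. A minimal sketch (all numbers hypothetical, not taken from the study):

```python
def compare(strategies):
    """Compare strategies against the first one, flagging simple dominance
    (no effectiveness gain at higher cost). Illustrative only."""
    base = strategies[0]
    for s in strategies[1:]:
        d_cost = s["cost"] - base["cost"]
        d_eff = s["effect"] - base["effect"]
        if d_eff <= 0 and d_cost >= 0:
            print(f'{s["name"]}: dominated by {base["name"]}')
        else:
            print(f'{s["name"]}: ICER = {d_cost / d_eff:,.0f} per unit of effect')

compare([
    {"name": "Fluorimetry (PKU only)", "cost": 1.0e6, "effect": 100.0},
    {"name": "MS/MS (PKU only)",       "cost": 4.0e6, "effect": 100.0},  # same effect, higher cost
    {"name": "MS/MS (PKU + MCAD)",     "cost": 4.5e6, "effect": 130.0},
])
```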

Relevance:

60.00%

Publisher:

Abstract:

We present a nonparametric prior over reversible Markov chains. We use completely random measures, specifically gamma processes, to construct a countably infinite graph with weighted edges. By enforcing symmetry to make the edges undirected, we define a prior over random walks on graphs that results in a reversible Markov chain. The resulting prior over infinite transition matrices is closely related to the hierarchical Dirichlet process but enforces reversibility. A reinforcement scheme with similar properties has recently been proposed, but its de Finetti measure is not well characterised. We take the alternative approach of explicitly constructing the mixing measure, which allows more straightforward and efficient inference at the cost of no longer having a closed-form predictive distribution. We use our process to construct a reversible infinite HMM, which we apply to two real datasets, one from epigenomics and one from ion channel recording.
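
A finite caricature of the construction may help: truncate to K states, draw i.i.d. gamma edge weights in place of the gamma process, and symmetrize; the random walk on the resulting weighted graph is reversible, with stationary distribution proportional to the node weights. K, the gamma parameters, and the seed below are illustrative choices, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)
K = 5                                   # finite truncation of the countably infinite graph
W = rng.gamma(shape=0.5, scale=1.0, size=(K, K))
W = (W + W.T) / 2.0                     # enforce symmetry -> undirected weighted graph

degree = W.sum(axis=1)                  # total edge weight at each node
P = W / degree[:, None]                 # random-walk transition matrix
pi = degree / degree.sum()              # stationary distribution, proportional to degree

# Detailed balance pi_i P_ij = pi_j P_ji holds because W is symmetric.
assert np.allclose(pi[:, None] * P, (pi[:, None] * P).T)
```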

Relevance:

60.00%

Publisher:

Abstract:

The computational detection of regulatory elements in DNA is a difficult but important problem impacting our progress in understanding the complex nature of eukaryotic gene regulation. Attempts to utilize cross-species conservation for this task have been hampered both by evolutionary changes of functional sites and poor performance of general-purpose alignment programs when applied to non-coding sequence. We describe a new and flexible framework for modeling binding site evolution in multiple related genomes, based on phylogenetic pair hidden Markov models which explicitly model the gain and loss of binding sites along a phylogeny. We demonstrate the value of this framework for both the alignment of regulatory regions and the inference of precise binding-site locations within those regions. As the underlying formalism is a stochastic, generative model, it can also be used to simulate the evolution of regulatory elements. Our implementation is scalable in terms of numbers of species and sequence lengths and can produce alignments and binding-site predictions with accuracy rivaling or exceeding current systems that specialize in only alignment or only binding-site prediction. We demonstrate the validity and power of various model components on extensive simulations of realistic sequence data and apply a specific model to study Drosophila enhancers in as many as ten related genomes and in the presence of gain and loss of binding sites. Different models and modeling assumptions can be easily specified, thus providing an invaluable tool for the exploration of biological hypotheses that can drive improvements in our understanding of the mechanisms and evolution of gene regulation.
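
The gain/loss component of such a model can be pictured in isolation as a two-state continuous-time Markov chain over "site absent"/"site present", whose transition probabilities along a branch of length t come from the matrix exponential of the rate matrix. A minimal sketch with made-up rates (the full phylogenetic pair HMM in the paper combines this with alignment states):

```python
import numpy as np
from scipy.linalg import expm

gain, loss = 0.2, 0.5          # hypothetical gain/loss rates per unit branch length
Q = np.array([[-gain,  gain],  # state 0: binding site absent
              [ loss, -loss]]) # state 1: binding site present

for t in (0.1, 1.0, 5.0):      # branch lengths
    P = expm(Q * t)            # P[i, j] = Pr(state j at the child | state i at the parent)
    print(f"t={t}: P(gain)={P[0, 1]:.3f}  P(loss)={P[1, 0]:.3f}")
```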

Relevance:

60.00%

Publisher:

Abstract:

In this paper we discuss the relationship and characterization of stochastic comparability, duality, and Feller–Reuter–Riley transition functions, which are closely linked with each other for continuous-time Markov chains. A necessary and sufficient condition for two Feller minimal transition functions to be stochastically comparable is given in terms of their density q-matrices only. Moreover, a necessary and sufficient condition under which a transition function is a dual for some stochastically monotone q-function is given, again, in terms of its density q-matrix. Finally, for a class of q-matrices, the necessary and sufficient condition for a transition function to be a Feller–Reuter–Riley transition function is also given.
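
As background for the terms involved (a standard Siegmund-type formulation on the non-negative integers, stated here only for orientation and not as the paper's exact definitions):

```latex
% Stochastic comparability: P is dominated by \bar{P} if, for all i, j and t \ge 0,
\sum_{k \ge j} p_{ik}(t) \;\le\; \sum_{k \ge j} \bar{p}_{ik}(t).

% Siegmund duality: \tilde{P} is a dual of P if, for all i, j and t \ge 0,
\sum_{k \le j} p_{ik}(t) \;=\; \sum_{k \ge i} \tilde{p}_{jk}(t),
% i.e. \Pr(X_t \le j \mid X_0 = i) = \Pr(\tilde{X}_t \ge i \mid \tilde{X}_0 = j).
```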

Relevance:

60.00%

Publisher:

Abstract:

The key problems in discussing duality and monotonicity for continuous-time Markov chains are to find conditions for existence and uniqueness and then to construct the corresponding processes in terms of their infinitesimal characteristics, i.e., q-matrices. Such problems are solved in this paper under the assumption that the given q-matrix is conservative. Some general properties of stochastically monotone Q-processes (Q is not necessarily conservative) are also discussed.
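
As a concrete finite, discrete-time analogue of stochastic monotonicity: a transition matrix is stochastically monotone when its rows, read from lower to higher starting states, are stochastically increasing. A small check (the example matrix is made up):

```python
import numpy as np

def is_stochastically_monotone(P):
    """True if the tail sums sum_{k >= j} P[i, k] are non-decreasing in i
    for every threshold j (finite, discrete-time analogue)."""
    tails = P[:, ::-1].cumsum(axis=1)[:, ::-1]   # tails[i, j] = sum_{k >= j} P[i, k]
    return bool(np.all(np.diff(tails, axis=0) >= -1e-12))

P = np.array([[0.6, 0.3, 0.1],
              [0.3, 0.4, 0.3],
              [0.1, 0.3, 0.6]])
print(is_stochastically_monotone(P))   # True for this example
```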

Relevance:

60.00%

Publisher:

Abstract:

Objective: To simultaneously evaluate 14 biomarkers from distinct biological pathways for risk prediction of ischemic stroke, including biomarkers of hemostasis, inflammation, and endothelial activation as well as chemokines and adipocytokines.
Methods and Results: The Prospective Epidemiological Study on Myocardial Infarction (PRIME) is a cohort of 9771 healthy men 50 to 59 years of age who were followed up over 10 years. In a nested case–control study, 95 ischemic stroke cases were matched with 190 controls. After multivariable adjustment for traditional risk factors, fibrinogen (odds ratio [OR], 1.53; 95% confidence interval [CI], 1.03–2.28), E-selectin (OR, 1.76; 95% CI, 1.06–2.93), interferon-γ-inducible-protein-10 (OR, 1.72; 95% CI, 1.06–2.78), resistin (OR, 2.86; 95% CI, 1.30–6.27), and total adiponectin (OR, 1.82; 95% CI, 1.04–3.19) were significantly associated with ischemic stroke. Adding E-selectin and resistin to a traditional risk factor model significantly increased the area under the receiver-operating characteristic curve from 0.679 (95% CI, 0.612–0.745) to 0.785 and 0.788, respectively, and yielded a categorical net reclassification improvement of 29.9% (P=0.001) and 28.4% (P=0.002), respectively. Their simultaneous inclusion in the traditional risk factor model increased the area under the receiver-operating characteristic curve to 0.824 (95% CI, 0.770–0.877) and resulted in a net reclassification improvement of 41.4% (P<0.001). Results were confirmed when using continuous net reclassification improvement.
Conclusion: Among multiple biomarkers from distinct biological pathways, E-selectin and resistin provided incremental and additive value to traditional risk factors in predicting ischemic stroke.
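
The categorical net reclassification improvement reported above has a simple form: among events, take the proportion moved to a higher risk category minus the proportion moved lower; do the reverse for non-events; and add the two. A small sketch (risk categories and data are hypothetical):

```python
import numpy as np

def categorical_nri(old_cat, new_cat, event):
    """Categorical NRI: (up - down | events) + (down - up | non-events).
    old_cat/new_cat are integer risk categories; event is a boolean array."""
    old_cat, new_cat, event = map(np.asarray, (old_cat, new_cat, event))
    up, down = new_cat > old_cat, new_cat < old_cat
    nri_events = up[event].mean() - down[event].mean()
    nri_nonevents = down[~event].mean() - up[~event].mean()
    return nri_events + nri_nonevents

# Hypothetical reclassification of 8 subjects between 3 risk categories (0 < 1 < 2).
old = [0, 1, 1, 2, 0, 1, 2, 1]
new = [1, 2, 1, 2, 0, 0, 1, 1]
had_stroke = [True, True, False, True, False, False, False, False]
print(f"NRI = {categorical_nri(old, new, had_stroke):.2f}")
```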

Relevance:

60.00%

Publisher:

Abstract:

In the last decade, mobile phones and mobile devices using cellular telecommunication network connections have become ubiquitous. In several developed countries, the penetration of such devices has surpassed 100 percent. They facilitate communication and access to large quantities of data without requiring a fixed location or connection. Since mobile phones are usually in close proximity to their user, their cellular activity and locations are indicative of the user's activities and movements. As such, these cellular devices can be regarded as a large-scale distributed platform for sensing human activity. This paper uses mobile operator telephony data to visualize the regional flows of people across the Republic of Ireland. In addition, the use of modified Markov chains for ranking the regions of interest most significant to mobile subscribers is investigated. A methodology is then presented that demonstrates how this ranking of significant regions of interest may be used to estimate the national population; the results are found to correlate strongly with census data.
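
One way to read "modified Markov chains for ranking regions" is as a PageRank-style scheme: build a transition matrix from observed movements between regions and rank regions by the stationary distribution of the damped chain. The sketch below assumes that interpretation; the damping factor and count matrix are illustrative, not the paper's actual construction.

```python
import numpy as np

def rank_regions(counts, damping=0.85, iters=200):
    """Rank regions by the stationary distribution of a damped random walk
    over a movement-count matrix (PageRank-style sketch)."""
    counts = np.asarray(counts, dtype=float)
    n = counts.shape[0]
    row_sums = counts.sum(axis=1, keepdims=True)
    P = np.where(row_sums > 0, counts / np.maximum(row_sums, 1e-12), 1.0 / n)
    G = damping * P + (1 - damping) / n          # damping guarantees ergodicity
    pi = np.full(n, 1.0 / n)
    for _ in range(iters):
        pi = pi @ G                              # power iteration
    return np.argsort(-pi), pi

# Hypothetical movement counts between 4 regions derived from cell-tower handovers.
counts = [[0, 120, 30, 10],
          [90,   0, 60, 20],
          [25,  70,  0, 15],
          [ 5,  20, 10,  0]]
order, scores = rank_regions(counts)
print("regions ranked most to least significant:", order)
```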

Relevance:

60.00%

Publisher:

Abstract:

An experimental study is presented that assesses the influence of redundancy and neutrality on the performance of a (1+1)-ES evolution strategy, modeled using Markov chains and applied to NK fitness landscapes. Two families of redundant binary representations are used: a non-neutral family based on linear transformations, which allows the phenotypic neighborhoods to be designed in a simple and effective way, and a neutral family based on the mathematical formulation of error-control codes. The results indicate whether redundancy or neutrality more strongly affects the behavior of the algorithm.
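
A (1+1)-ES on a finite bit-string space can be modeled exactly as a Markov chain: states are genotypes, the mutation kernel gives the probability of proposing each offspring, and proposals with lower fitness are rejected so the chain stays put. A minimal sketch for an arbitrary fitness function on short bit strings (the NK landscapes and redundant encodings from the study are not reproduced here):

```python
import itertools
import numpy as np

def es_transition_matrix(n_bits, fitness, p_mut=None):
    """Exact transition matrix of a (1+1)-ES with per-bit mutation probability
    p_mut and elitist acceptance (offspring kept only if not worse)."""
    p_mut = p_mut if p_mut is not None else 1.0 / n_bits
    states = list(itertools.product([0, 1], repeat=n_bits))
    m = len(states)
    P = np.zeros((m, m))
    for i, x in enumerate(states):
        for j, y in enumerate(states):
            d = sum(a != b for a, b in zip(x, y))            # Hamming distance
            prob = p_mut ** d * (1 - p_mut) ** (n_bits - d)  # mutation probability
            if fitness(y) >= fitness(x):                     # offspring accepted
                P[i, j] += prob
            else:                                            # rejected: stay at x
                P[i, i] += prob
    return P

onemax = lambda s: sum(s)                                    # toy fitness: count of ones
P = es_transition_matrix(3, onemax)
print(P.round(3))
print("rows sum to 1:", np.allclose(P.sum(axis=1), 1.0))
```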

Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND: Lipid-lowering therapy is costly but effective at reducing coronary heart disease (CHD) risk.
OBJECTIVE: To assess the cost-effectiveness and public health impact of Adult Treatment Panel III (ATP III) guidelines and compare with a range of risk- and age-based alternative strategies.
DESIGN: The CHD Policy Model, a Markov-type cost-effectiveness model.
DATA SOURCES: National surveys (1999 to 2004), vital statistics (2000), the Framingham Heart Study (1948 to 2000), other published data, and a direct survey of statin costs (2008).
TARGET POPULATION: U.S. population age 35 to 85 years.
TIME HORIZON: 2010 to 2040.
PERSPECTIVE: Health care system.
INTERVENTION: Lowering of low-density lipoprotein cholesterol with HMG-CoA reductase inhibitors (statins).
OUTCOME MEASURE: Incremental cost-effectiveness.
RESULTS OF BASE-CASE ANALYSIS: Full adherence to ATP III primary prevention guidelines would require starting (9.7 million) or intensifying (1.4 million) statin therapy for 11.1 million adults and would prevent 20,000 myocardial infarctions and 10,000 CHD deaths per year at an annual net cost of $3.6 billion ($42,000/QALY) if low-intensity statins cost $2.11 per pill. The ATP III guidelines would be preferred over alternative strategies if society is willing to pay $50,000/QALY and statins cost $1.54 to $2.21 per pill. At higher statin costs, ATP III is not cost-effective; at lower costs, more liberal statin-prescribing strategies would be preferred; and at costs less than $0.10 per pill, treating all persons with low-density lipoprotein cholesterol levels greater than 3.4 mmol/L (>130 mg/dL) would yield net cost savings.
RESULTS OF SENSITIVITY ANALYSIS: Results are sensitive to the assumptions that LDL cholesterol becomes less important as a risk factor with increasing age and that little disutility results from taking a pill every day.
LIMITATION: Randomized trial evidence for statin effectiveness is not available for all subgroups.
CONCLUSION: The ATP III guidelines are relatively cost-effective and would have a large public health impact if implemented fully in the United States. Alternate strategies may be preferred, however, depending on the cost of statins and how much society is willing to pay for better health outcomes.
FUNDING: Flight Attendants' Medical Research Institute and the Swanson Family Fund. The Framingham Heart Study and Framingham Offspring Study are conducted and supported by the National Heart, Lung, and Blood Institute.

Relevance:

60.00%

Publisher:

Abstract:

This paper develops a general stochastic framework and an equilibrium asset pricing model that make clear how attitudes towards intertemporal substitution and risk matter for option pricing. In particular, we show under which statistical conditions option pricing formulas are not preference-free, in other words, when preferences are not hidden in the stock and bond prices as they are in the standard Black and Scholes (BS) or Hull and White (HW) pricing formulas. The dependence of option prices on preference parameters comes from several instantaneous causality effects such as the so-called leverage effect. We also emphasize that the most standard asset pricing models (CAPM for the stock and BS or HW preference-free option pricing) are valid under the same stochastic setting (typically the absence of leverage effect), regardless of preference parameter values. Even though we propose a general non-preference-free option pricing formula, we always keep in mind that the BS formula is dominant both as a theoretical reference model and as a tool for practitioners. Another contribution of the paper is to characterize why the BS formula is such a benchmark. We show that, as soon as we are ready to accept a basic property of option prices, namely their homogeneity of degree one with respect to the pair formed by the underlying stock price and the strike price, the necessary statistical hypotheses for homogeneity provide BS-shaped option prices in equilibrium. This BS-shaped option-pricing formula allows us to derive interesting characterizations of the volatility smile, that is, the pattern of BS implicit volatilities as a function of the option moneyness. First, the asymmetry of the smile is shown to be equivalent to a particular form of asymmetry of the equivalent martingale measure. Second, this asymmetry appears precisely when there is either a premium on an instantaneous interest rate risk or on a generalized leverage effect or both, in other words, whenever the option pricing formula is not preference-free. Therefore, the main conclusion of our analysis for practitioners should be that an asymmetric smile is indicative of the relevance of preference parameters to price options.
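
For reference, the homogeneity property invoked in the abstract, together with the Black and Scholes call price that satisfies it (standard textbook formulas, included only to make the statement concrete):

```latex
C(\lambda S, \lambda K) = \lambda\, C(S, K) \quad \text{for all } \lambda > 0
\qquad \text{(homogeneity of degree one in } (S, K)\text{)},

C_{BS}(S, K) = S\,N(d_1) - K e^{-rT} N(d_2), \qquad
d_{1,2} = \frac{\ln(S/K) + \left(r \pm \tfrac{1}{2}\sigma^2\right) T}{\sigma \sqrt{T}} .
```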

Relevance:

60.00%

Publisher:

Abstract:

This paper studies the transition between exchange rate regimes using a Markov chain model with time-varying transition probabilities. The probabilities are parameterized as nonlinear functions of variables suggested by the currency crisis and optimal currency area literature. Results using annual data indicate that inflation and, to a lesser extent, output growth and trade openness help explain the exchange rate regime transition dynamics.
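
A common way to parameterize time-varying transition probabilities (in the spirit of the approach described, though the exact functional form used in the paper may differ) is a logistic link on the covariates; for a two-regime chain:

```latex
\Pr(s_t = 1 \mid s_{t-1} = 1, x_{t-1}) = \frac{\exp(x_{t-1}'\beta_1)}{1 + \exp(x_{t-1}'\beta_1)},
\qquad
\Pr(s_t = 2 \mid s_{t-1} = 2, x_{t-1}) = \frac{\exp(x_{t-1}'\beta_2)}{1 + \exp(x_{t-1}'\beta_2)},
```

with the off-diagonal probabilities determined by each row of the transition matrix summing to one, and the covariate vector collecting candidate explanatory variables such as inflation, output growth, and trade openness.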

Relevance:

60.00%

Publisher:

Abstract:

Financial assets are often modeled by stochastic differential equations (SDEs). These equations can describe the behavior of the asset and, sometimes, of certain model parameters as well. For example, the Heston (1993) model, which belongs to the class of stochastic volatility models, describes the behavior of the asset and of its variance. The Heston model is very attractive because it admits semi-analytical formulas for certain derivatives, as well as a certain degree of realism. However, most simulation algorithms for this model run into problems when the Feller (1951) condition is not satisfied. In this thesis, we introduce three new simulation algorithms for the Heston model. These new algorithms aim to speed up the well-known algorithm of Broadie and Kaya (2006); to do so, we use, among other tools, Markov chain Monte Carlo (MCMC) methods and approximations. In the first algorithm, we modify the second step of the Broadie and Kaya method in order to speed it up: instead of using the second-order Newton method and the inversion approach, we use the Metropolis-Hastings algorithm (see Hastings (1970)). The second algorithm is an improvement of the first. Instead of using the true density of the integrated variance, we use the approximation of Smith (2007). This improvement reduces the dimension of the characteristic equation and speeds up the algorithm. Our last algorithm is not based on an MCMC method, but we still try to speed up the second step of the Broadie and Kaya (2006) method. To achieve this, we use a gamma random variable whose moments are matched to those of the true random variable of the variance integrated over time. According to Stewart et al. (2007), it is possible to approximate a convolution of gamma random variables (which closely resembles the representation given by Glasserman and Kim (2008) when the time step is small) by a single gamma random variable.
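
The moment-matching idea behind the third algorithm can be illustrated in isolation: given the conditional mean and variance of the integrated variance, choose the gamma shape and scale so that the first two moments agree. A minimal sketch (the moments below are placeholders, not the Heston conditional moments derived in the thesis):

```python
import numpy as np

def gamma_by_moment_matching(mean, var, size, rng=None):
    """Draw from a gamma distribution whose first two moments match
    (mean, var): shape = mean^2 / var, scale = var / mean."""
    rng = rng or np.random.default_rng()
    shape = mean ** 2 / var
    scale = var / mean
    return rng.gamma(shape, scale, size)

# Placeholder conditional moments of the time-integrated variance.
mean_iv, var_iv = 0.04, 0.0005
samples = gamma_by_moment_matching(mean_iv, var_iv, size=100_000)
print(samples.mean(), samples.var())   # should be close to 0.04 and 0.0005
```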