938 results for Chaîne de Markov cachée
Abstract:
Hepatitis B is a worldwide health problem affecting about 2 billion people, of whom more than 350 million are chronic carriers of the virus. Nine HBV genotypes (A to I) have been described. The geographical distribution of HBV genotypes is not completely understood due to the limited number of samples from some parts of the world. One such example is Colombia, where few studies have described the HBV genotypes. In this study, we characterized HBV genotypes in 143 HBsAg-positive volunteer blood donors from Colombia. A fragment of 1306 bp partially comprising the HBsAg and DNA polymerase coding regions (S/POL) was amplified and sequenced. Bayesian phylogenetic analyses were conducted using the Markov chain Monte Carlo (MCMC) approach to obtain the maximum clade credibility (MCC) tree using BEAST v.1.5.3. Of all samples, 68 were positive and 52 were successfully sequenced. Genotype F was the most prevalent in this population (77%) - subgenotypes F3 (75%) and F1b (2%). Genotype G (7.7%) and subgenotype A2 (15.3%) were also found. Genotype G sequence analysis suggests distinct introductions of this genotype into the country. Furthermore, we estimated the time of the most recent common ancestor (TMRCA) for each HBV/F subgenotype and also for the Colombian F3 sequences using two different datasets: (i) 77 sequences comprising 1306 bp of the S/POL region and (ii) 283 sequences comprising 681 bp of the S/POL region. We also used two previously estimated evolutionary rates: (i) 2.60 × 10^-4 s/s/y and (ii) 1.5 × 10^-5 s/s/y. Here we report the HBV genotypes circulating in Colombia and the estimated TMRCA for the four different subgenotypes of genotype F. © 2010 Elsevier B.V. All rights reserved.
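As a rough illustration of how a fixed substitution rate turns genetic distance into a divergence time, here is a minimal Python sketch under a strict molecular clock; the pairwise distance is a hypothetical value and this is not the BEAST analysis used in the study:

```python
def tmrca_years(pairwise_distance, subst_rate):
    """Strict-clock back-of-the-envelope TMRCA: two lineages separated by
    `pairwise_distance` substitutions/site have each accumulated half of that
    distance since their most recent common ancestor."""
    return pairwise_distance / (2.0 * subst_rate)

# Hypothetical S/POL distance of 0.02 substitutions/site, evaluated under the
# two published rates cited in the abstract.
for rate in (2.60e-4, 1.5e-5):
    print(f"rate {rate:g} s/s/y -> TMRCA ~ {tmrca_years(0.02, rate):,.0f} years")
```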
Abstract:
Hepatitis C virus (HCV) is a frequent cause of acute and chronic hepatitis and a leading cause of cirrhosis of the liver and hepatocellular carcinoma. HCV is classified into six major genotypes and more than 70 subtypes. In Colombian blood banks, serum samples were tested for anti-HCV antibodies using a third-generation ELISA. The aim of this study was to characterize the viral sequences in plasma of 184 volunteer blood donors who attended the "Banco Nacional de Sangre de la Cruz Roja Colombiana," Bogota, Colombia. Three different HCV genomic regions were amplified by nested PCR. The first was a 180 bp segment of the 5'UTR region, used to confirm the previous diagnosis by ELISA. From the samples positive for the 5'UTR region, two further segments were amplified for genotyping and subtyping by phylogenetic analysis: a 380 bp segment from the NS5B region and a 391 bp segment from the E1 region. The distribution of HCV subtypes was: 1b (82.8%), 1a (5.7%), 2a (5.7%), 2b (2.8%), and 3a (2.8%). By applying Bayesian Markov chain Monte Carlo simulation, it was estimated that HCV-1b was introduced into Bogota around 1950. This subtype then spread at an exponential rate between about 1970 and about 1990, after which transmission of HCV was reduced by anti-HCV testing of this population. Among Colombian blood donors, HCV genotype 1b is the most frequent genotype, especially in large urban conglomerates such as Bogota, as is the case in other South American countries. J. Med. Virol. 82: 1889-1898, 2010. © 2010 Wiley-Liss, Inc.
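The dating above rests on Bayesian MCMC. The following minimal random-walk Metropolis sampler is a generic sketch on a toy one-parameter target, not the coalescent model fitted in the study; it only illustrates the accept/reject machinery such analyses rely on:

```python
import math
import random

def metropolis(log_post, x0, n_iter=50_000, step=0.05, seed=0):
    """Random-walk Metropolis: propose a Gaussian perturbation and accept it
    with probability min(1, posterior ratio)."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_iter):
        prop = x + rng.gauss(0.0, step)
        lp_prop = log_post(prop)
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Toy target: a Gaussian "posterior" for an epidemic growth rate centred at 0.12.
draws = metropolis(lambda g: -0.5 * ((g - 0.12) / 0.03) ** 2, x0=0.0)
print(sum(draws[10_000:]) / len(draws[10_000:]))   # posterior mean after burn-in, ~0.12
```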
Abstract:
Molecular epidemiological data on the hepatitis B virus (HBV) in Chile remain incomplete. Since HBV genotype F is the most prevalent in the country, the goal of this study was to obtain full HBV genome sequences from chronically infected patients in order to determine their subgenotypes and the occurrence of resistance-associated mutations. Twenty-one serum samples from antiviral drug-naive patients with chronic hepatitis B were subjected to full-length PCR amplification, and both strands of the whole genomes were fully sequenced. Phylogenetic analyses were performed along with reference sequences available from GenBank (n = 290). The sequences were aligned using Clustal X and edited in the SE-AL software. Bayesian phylogenetic analyses were conducted by Markov chain Monte Carlo (MCMC) simulation for 10 million generations in order to obtain the substitution tree using BEAST. The sequences were also analyzed for the presence of primary drug resistance mutations using CodonCode Aligner software. The phylogenetic analyses indicated that all sequences belonged to HBV subgenotype F1b, clustered into four different groups, suggesting that diverse lineages of this subgenotype may be circulating within this population of Chilean patients. J. Med. Virol. 83: 1530-1536, 2011. © 2011 Wiley-Liss, Inc.
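A minimal sketch of the kind of resistance-mutation screen described, assuming an amino-acid sequence already aligned to standard rt numbering; the lookup table lists well-known lamivudine-associated positions only, is not exhaustive, and the study itself used CodonCode Aligner rather than a script like this:

```python
# Illustrative primary lamivudine-resistance positions in the HBV RT domain.
RESISTANCE_SITES = {
    180: ("L", {"M"}),            # rtL180M
    204: ("M", {"V", "I", "S"}),  # rtM204V/I/S (YMDD motif)
}

def scan_resistance(rt_protein):
    """Report resistance-associated substitutions in an RT amino-acid string,
    assumed aligned so that index i corresponds to rt position i + 1."""
    hits = []
    for pos, (wild_type, variants) in RESISTANCE_SITES.items():
        residue = rt_protein[pos - 1]
        if residue in variants:
            hits.append(f"rt{wild_type}{pos}{residue}")
    return hits

# Hypothetical toy sequence: wild type at rt180, mutant at rt204.
toy = ["A"] * 344
toy[180 - 1], toy[204 - 1] = "L", "V"
print(scan_resistance("".join(toy)))   # ['rtM204V']
```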
Abstract:
Background: At least for a subset of patients, the clinical diagnosis of mild cognitive impairment (MCI) may represent an intermediate stage between normal aging and dementia. Nevertheless, the patterns of transition between cognitive states from normal cognitive aging and MCI to dementia are not well established. In this study we address the pattern of transitions between cognitive states in patients with MCI and healthy controls, prior to conversion to dementia. Methods: 139 subjects (78% women; mean age, 68.5 ± 6.1 years; mean educational level, 11.7 ± 5.4 years) were consecutively assessed in a memory clinic with a standardized clinical and neuropsychological protocol and classified as cognitively healthy (normal controls) or with MCI (including subtypes) at baseline. These subjects underwent annual reassessments (mean duration of follow-up: 2.7 ± 1.1 years), in which cognitive state was ascertained independently of prior diagnoses. The pattern of transitions between cognitive states was determined by Markov chain analysis. Results: The transitions from one cognitive state to another varied substantially between MCI subtypes. Single-domain MCI (amnestic and non-amnestic) more frequently returned to a normal cognitive state upon follow-up (22.5% and 21%, respectively). Among subjects who progressed to Alzheimer's disease (AD), the most common diagnosis immediately prior to conversion was multiple-domain MCI (85%). Conclusion: The clinical diagnosis of MCI and its subtypes yields groups of patients with heterogeneous patterns of transitions from one given cognitive state to another. The presence of more severe and widespread cognitive deficits, as indicated by the multiple-domain amnestic MCI group, may be a better predictor of AD than single-domain amnestic or non-amnestic deficits. These higher-risk individuals may be the best candidates for the development of preventive strategies and early treatment for the disease.
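A minimal Python sketch, with hypothetical state labels and toy trajectories, of how a first-order Markov transition matrix can be estimated from annual cognitive-state assessments of the kind described:

```python
import numpy as np

STATES = ["normal", "aMCI-single", "naMCI-single", "MCI-multiple", "dementia"]

def transition_matrix(trajectories, n_states=len(STATES)):
    """Estimate first-order Markov transition probabilities from sequences of
    state indices (one sequence per subject, one entry per annual visit)."""
    counts = np.zeros((n_states, n_states))
    for traj in trajectories:
        for a, b in zip(traj[:-1], traj[1:]):
            counts[a, b] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, row_sums, out=np.zeros_like(counts),
                     where=row_sums > 0)

# Toy example: three subjects followed over a few annual visits.
print(transition_matrix([[0, 1, 1, 4], [3, 3, 4], [1, 0, 0]]))
```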
Abstract:
Introduction: Different modalities exist for the palliation of obstructive symptoms in patients with unresectable esophageal cancer (EC). However, these therapeutic alternatives differ significantly in cost and effectiveness. Methods: A Markov model was designed to compare the cost-effectiveness (CE) of self-expandable stents (SES), brachytherapy, and laser in the palliation of unresectable EC. Patients were assigned to one of the strategies, and the improvement in swallowing function was compared given the treatment efficacy, probability of survival, and risks of complications associated with each strategy. Probabilities and distribution parameters were based on a 9-month time frame. Results: Under the base-case scenario, laser has the lowest CE ratio, followed by brachytherapy at an incremental cost-effectiveness ratio (ICER) of $4,400.00, and SES is a dominated strategy. In the probabilistic analysis, laser is the strategy with the highest probability of cost-effectiveness for willingness-to-pay (WTP) values lower than $3,201, and brachytherapy for all WTP values yielding a positive net health benefit (NHB) (threshold $4,440). The highest probability of cost-effectiveness for brachytherapy is 96%; consequently, selection of suboptimal strategies can lead to opportunity losses for the US health system ranging from US$4.32 to US$38.09 million over the next 5-20 years. Conclusion: Conditional on the WTP and current US Medicare costs, palliation of unresectable esophageal cancer with brachytherapy provides the largest amount of NHB and is the strategy with the highest probability of CE. However, some level of uncertainty remains, and wrong decisions will be made until further knowledge is acquired.
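The decision metrics above reduce to two simple formulas. The sketch below uses hypothetical costs and effects chosen only so that the incremental comparison reproduces an ICER of $4,400; these are not the study's inputs:

```python
def icer(cost_ref, eff_ref, cost_new, eff_new):
    """Incremental cost-effectiveness ratio of the new strategy vs the reference."""
    return (cost_new - cost_ref) / (eff_new - eff_ref)

def net_health_benefit(cost, eff, wtp):
    """Net health benefit at a given willingness-to-pay per unit of effectiveness."""
    return eff - cost / wtp

# Hypothetical inputs (cost in US$, effectiveness in quality-adjusted months of
# adequate swallowing over the 9-month horizon).
laser = {"cost": 6_000.0, "eff": 3.0}
brachy = {"cost": 10_400.0, "eff": 4.0}

print(icer(laser["cost"], laser["eff"], brachy["cost"], brachy["eff"]))  # 4400.0
print(net_health_benefit(brachy["cost"], brachy["eff"], wtp=4_440.0))    # positive NHB
```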
Abstract:
Schistosoma mansoni is responsible for the neglected tropical disease schistosomiasis, which affects 210 million people in 76 countries. Here we present an analysis of the 363 megabase nuclear genome of the blood fluke. It encodes at least 11,809 genes, with an unusual intron size distribution and new families of micro-exon genes that undergo frequent alternative splicing. As the first sequenced flatworm, and a representative of the Lophotrochozoa, it offers insights into early events in the evolution of the animals, including the development of a body pattern with bilateral symmetry and the development of tissues into organs. Our analysis has been informed by the need to find new drug targets. The deficits in lipid metabolism that make schistosomes dependent on the host are revealed, and the identification of membrane receptors, ion channels and more than 300 proteases provides new insights into the biology of the life cycle and new targets. Bioinformatics approaches have identified metabolic chokepoints, and a chemogenomic screen has pinpointed schistosome proteins for which existing drugs may be active. The information generated provides an invaluable resource for the research community to develop much-needed new control tools for the treatment and eradication of this important and neglected disease.
Abstract:
Surrogate methods for detecting lateral gene transfer are those that do not require inference of phylogenetic trees. Herein I apply four such methods to identify open reading frames (ORFs) in the genome of Escherichia coli K12 that may have arisen by lateral gene transfer. Only two of these methods detect the same ORFs more frequently than expected by chance, whereas several intersections contain many fewer ORFs than expected. Each of the four methods detects a different non-random set of ORFs. The methods may detect lateral ORFs of different relative ages; testing this hypothesis will require rigorous inference of trees. © 2001 Federation of European Microbiological Societies. Published by Elsevier Science B.V. All rights reserved.
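One standard way to test whether two surrogate methods flag the same ORFs more often than expected by chance is a hypergeometric tail probability. This is a hedged sketch with hypothetical counts, not the paper's data or its exact test:

```python
from scipy.stats import hypergeom

def overlap_pvalue(n_genome_orfs, n_flagged_a, n_flagged_b, n_both):
    """P(overlap >= n_both) under the null that method B's calls are a random
    draw from the genome's ORFs, independent of method A's calls."""
    return hypergeom.sf(n_both - 1, n_genome_orfs, n_flagged_a, n_flagged_b)

# Hypothetical counts: ~4,300 ORFs in E. coli K-12, two methods flagging
# 600 and 450 ORFs, 120 of which are shared.
print(overlap_pvalue(4_300, 600, 450, 120))
```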
Abstract:
Applied econometricians often fail to impose economic regularity constraints in the exact form economic theory prescribes. We show how the Singular Value Decomposition (SVD) Theorem and Markov Chain Monte Carlo (MCMC) methods can be used to rigorously impose time- and firm-varying equality and inequality constraints. To illustrate the technique we estimate a system of translog input demand functions subject to all the constraints implied by economic theory, including observation-varying symmetry and concavity constraints. Results are presented in the form of characteristics of the estimated posterior distributions of functions of the parameters. Copyright © 2001 John Wiley & Sons, Ltd.
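A minimal sketch of the SVD null-space reparameterization that makes exact linear equality constraints (such as translog symmetry) hold by construction; the constraint matrix below is hypothetical and the inequality (concavity) side of the paper's method is not shown:

```python
import numpy as np

def null_space_parameterization(R, r):
    """Given constraints R @ beta = r, return (beta0, Z) such that every
    beta = beta0 + Z @ gamma satisfies the constraints exactly; Z is an
    orthonormal basis of the null space of R obtained from its SVD."""
    U, s, Vt = np.linalg.svd(R)
    rank = int(np.sum(s > 1e-10))
    beta0 = np.linalg.lstsq(R, r, rcond=None)[0]   # one particular solution
    Z = Vt[rank:].T                                # null-space basis vectors
    return beta0, Z

# Hypothetical symmetry-style constraints on a 4-parameter model.
R = np.array([[1.0, -1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, -1.0]])
r = np.zeros(2)
beta0, Z = null_space_parameterization(R, r)
gamma = np.array([0.3, -0.7])        # unconstrained parameters, e.g. MCMC draws
beta = beta0 + Z @ gamma
print(np.allclose(R @ beta, r))      # True: constraints hold exactly
```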
Abstract:
The two-node tandem Jackson network serves as a convenient reference model for the analysis and testing of different methodologies and techniques in rare event simulation. In this paper we consider a new approach to efficiently estimate the probability that the content of the second buffer exceeds some high level L before it becomes empty, starting from a given state. The approach is based on a Markov additive process representation of the buffer processes, leading to an exponential change of measure to be used in an importance sampling procedure. Unlike changes of measure proposed and studied in recent literature, the one derived here is a function of the content of the first buffer. We prove that when the first buffer is finite, this method yields asymptotically efficient simulation for any set of arrival and service rates. In fact, the relative error is bounded independently of the level L, a new result that has not been established for any other known method. When the first buffer is infinite, we propose a natural extension of the exponential change of measure for the finite buffer case. In this case, the relative error is shown to be bounded (independently of L) only when the second server is the bottleneck, a result which is known to hold for some other methods derived through large deviations analysis. When the first server is the bottleneck, experimental results using our method suggest that the relative error is bounded by a function that grows linearly in L.
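The paper's change of measure depends on the content of the first buffer. As a simpler hedged illustration of the underlying idea, here is a sketch of importance sampling for a single M/M/1-type queue, using the classic exponential tilt that swaps arrival and service rates on the embedded random walk (not the two-node scheme of the paper):

```python
import random

def overflow_prob_is(lam=1.0, mu=2.0, L=20, n_paths=100_000, seed=1):
    """Estimate P(level reaches L before 0 | start at 1) for the embedded
    random walk of an M/M/1 queue by importance sampling: simulate under
    swapped rates and reweight each step by its likelihood ratio."""
    p = lam / (lam + mu)        # up-step probability under the original measure
    p_is = mu / (lam + mu)      # up-step probability under the tilted measure
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        x, w = 1, 1.0
        while 0 < x < L:
            if rng.random() < p_is:
                x += 1
                w *= p / p_is                    # likelihood ratio, up-step
            else:
                x -= 1
                w *= (1.0 - p) / (1.0 - p_is)    # likelihood ratio, down-step
        if x == L:                               # overflow before emptying
            total += w
    return total / n_paths

# Exact gambler's-ruin value for comparison: (1 - mu/lam) / (1 - (mu/lam)**L).
print(overflow_prob_is())
```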
Abstract:
For Markov processes on the positive integers with the origin as an absorbing state, Ferrari, Kesten, Martinez and Picco studied the existence of quasi-stationary and limiting conditional distributions by characterizing quasi-stationary distributions as fixed points of a transformation Phi on the space of probability distributions on {1, 2, ...}. In the case of a birth-death process, the components of Phi(nu) can be written down explicitly for any given distribution nu. Using this explicit representation, we will show that Phi preserves likelihood ratio ordering between distributions. A conjecture of Kryscio and Lefevre concerning the quasi-stationary distribution of the SIS logistic epidemic follows as a corollary.
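The paper characterizes quasi-stationary distributions as fixed points of Phi. A more generic numerical route (a hedged sketch, not the Phi iteration itself) is to take the normalized left eigenvector of the generator restricted to the transient states, shown here for the SIS logistic epidemic with illustrative rate parameters:

```python
import numpy as np

def sis_quasi_stationary(N=50, beta=2.0, gamma=1.0):
    """QSD of the SIS logistic epidemic on {0,...,N} with 0 absorbing:
    infection rate beta*i*(N-i)/N, recovery rate gamma*i. Computed as the
    left eigenvector of the generator restricted to {1,...,N} associated
    with the eigenvalue of largest real part."""
    Q = np.zeros((N, N))
    for k, i in enumerate(range(1, N + 1)):
        up = beta * i * (N - i) / N
        down = gamma * i
        if i < N:
            Q[k, k + 1] = up
        if i > 1:
            Q[k, k - 1] = down
        Q[k, k] = -(up + down)       # from state 1, the down-jump is absorption
    vals, vecs = np.linalg.eig(Q.T)  # left eigenvectors of Q
    lead = np.argmax(vals.real)
    qsd = np.abs(vecs[:, lead].real)
    return qsd / qsd.sum()

# Mode of the QSD sits near the endemic level N * (1 - gamma/beta).
print(np.argmax(sis_quasi_stationary()) + 1)
```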
Abstract:
A decision theory framework can be a powerful technique to derive optimal management decisions for endangered species. We built a spatially realistic stochastic metapopulation model for the Mount Lofty Ranges Southern Emu-wren (Stipiturus malachurus intermedius), a critically endangered Australian bird. Using discrete-time Markov chains to describe the dynamics of a metapopulation and stochastic dynamic programming (SDP) to find optimal solutions, we evaluated the following different management decisions: enlarging existing patches, linking patches via corridors, and creating a new patch. This is the first application of SDP to optimal landscape reconstruction and one of the few times that landscape reconstruction dynamics have been integrated with population dynamics. SDP is a powerful tool that has advantages over standard Monte Carlo simulation methods because it can give the exact optimal strategy for every landscape configuration (combination of patch areas and presence of corridors) and pattern of metapopulation occupancy, as well as a trajectory of strategies. It is useful when a sequence of management actions can be performed over a given time horizon, as is the case for many endangered species recovery programs, where only fixed amounts of resources are available in each time step. However, it is generally limited by computational constraints to rather small networks of patches. The model shows that optimal metapopulation management decisions depend greatly on the current state of the metapopulation, and there is no strategy that is universally the best. The extinction probability over 30 yr for the optimal state-dependent management actions is 50-80% better than no management, whereas the best fixed state-independent sets of strategies are only 30% better than no management. This highlights the advantages of using a decision theory tool to investigate conservation strategies for metapopulations. It is clear from these results that the sequence of management actions is critical, and this can only be effectively derived from stochastic dynamic programming. The model illustrates the underlying difficulty in determining simple rules of thumb for the sequence of management actions for a metapopulation. This use of a decision theory framework extends the capacity of population viability analysis (PVA) to manage threatened species.
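A minimal sketch of the backward-induction core of stochastic dynamic programming; the states, actions, transition matrices and rewards below are hypothetical placeholders, not the authors' metapopulation model:

```python
import numpy as np

def backward_induction(P, reward, horizon):
    """Finite-horizon SDP: P[a] is the S x S transition matrix under action a,
    reward[a] the immediate reward vector. Returns the optimal value function
    and the optimal action for every state at every time step."""
    n_actions, S = len(P), P[0].shape[0]
    V = np.zeros(S)
    policy = np.zeros((horizon, S), dtype=int)
    for t in reversed(range(horizon)):
        Q = np.array([reward[a] + P[a] @ V for a in range(n_actions)])
        policy[t] = Q.argmax(axis=0)
        V = Q.max(axis=0)
    return V, policy

# Toy 3-state example (0 = extinct, 1 = one patch occupied, 2 = both occupied);
# action 0 = do nothing, action 1 = costly management that lowers extinction risk.
P = [np.array([[1.0, 0.0, 0.0], [0.3, 0.5, 0.2], [0.1, 0.3, 0.6]]),
     np.array([[1.0, 0.0, 0.0], [0.1, 0.6, 0.3], [0.05, 0.25, 0.7]])]
reward = [np.array([0.0, 1.0, 2.0]), np.array([0.0, 0.5, 1.5])]
V, policy = backward_induction(P, reward, horizon=30)
print(policy[0])    # best first action in each state
```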
Abstract:
Many large-scale stochastic systems, such as telecommunications networks, can be modelled using a continuous-time Markov chain. However, it is frequently the case that a satisfactory analysis of their time-dependent, or even equilibrium, behaviour is impossible. In this paper, we propose a new method of analysing Markovian models, whereby the existing transition structure is replaced by a more amenable one. Using rates of transition given by the equilibrium expected rates of the corresponding transitions of the original chain, we are able to approximate its behaviour. We present two formulations of the idea of expected rates. The first provides a method for analysing time-dependent behaviour, while the second provides a highly accurate means of analysing equilibrium behaviour. We shall illustrate our approach with reference to a variety of models, giving particular attention to queueing and loss networks. (C) 2003 Elsevier Ltd. All rights reserved.
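A building block of the expected-rates idea is the equilibrium distribution of the original chain. This is a minimal sketch of solving pi Q = 0 with the normalization constraint; the generator below is a hypothetical toy example, and the paper's expected-rate construction itself is not reproduced:

```python
import numpy as np

def stationary_distribution(Q):
    """Equilibrium distribution of a CTMC with generator Q: solve pi @ Q = 0
    together with sum(pi) = 1 as a least-squares system."""
    n = Q.shape[0]
    A = np.vstack([Q.T, np.ones((1, n))])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

# Toy 3-state generator (rows sum to zero).
Q = np.array([[-2.0, 1.5, 0.5],
              [1.0, -3.0, 2.0],
              [0.5, 0.5, -1.0]])
pi = stationary_distribution(Q)
print(pi, pi @ Q)    # pi @ Q should be ~0; pi[i] * Q[i, j] gives equilibrium flows
```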
Abstract:
The main objective of this work was to identify and characterize the daily evolution of the Atmospheric Boundary Layer (ABL) in the Greater Vitória Region (RGV), Espírito Santo State, Brazil, and in the Dunkerque Region (RD), Nord-Pas-de-Calais, France, assessing the accuracy of parameterizations used in the Weather Research and Forecasting (WRF) meteorological model in detecting the formation and attributes of the Internal Boundary Layer (IBL) formed by sea breezes. The RGV has complex relief, a coastal region with rugged topography and a mountain chain parallel to the coast. The RD has simple relief, a coastal region with small undulations that do not exceed 150 meters across the study domain. To evaluate the model forecasts, results from two field campaigns were used: one carried out in the city of Dunkerque, northern France, in July 2009, using a light detection and ranging (LIDAR) system, a sonic detection and ranging (SODAR) system and data from a surface meteorological station; the other carried out in the city of Vitória, Espírito Santo, in July 2012, also using a LIDAR, a SODAR and data from a surface meteorological station. Simulations were run using three ABL parameterization schemes, two with non-local closure, Yonsei University (YSU) and Asymmetric Convective Model 2 (ACM2), and one with local closure, Mellor-Yamada-Janjic (MYJ), together with two land-surface schemes, Rapid Update Cycle (RUC) and Noah. For both the RGV and the RD, simulations were run with the six possible combinations of the three ABL parameterizations and the two land-surface schemes, for the campaign periods, using four nested domains: the three larger ones square, with side lengths of 1863 km, 891 km and 297 km and grid spacings of 27 km, 9 km and 3 km, respectively, and the study domain, 81 km in the North-South direction and 63 km East-West, with a 1 km grid and 55 vertical levels up to a maximum of approximately 13,400 m, concentrated more densely near the ground. The results of this work showed that: a) depending on the configuration adopted, the computational effort can increase substantially without a correspondingly large gain in accuracy; b) for the RD, the simulation using the MYJ ABL parameterization together with the Noah scheme produced the best estimate, capturing the IBL phenomena, whereas the simulations using the ACM2 and YSU parameterizations placed the onset of the sea breeze up to three hours late; c) for the RGV, the simulation using the YSU ABL parameterization together with the Noah land-surface scheme made the best inferences about the IBL. These results suggest the need for prior assessment of the computational effort required by particular configurations and of the accuracy of specific parameterization sets for each region studied. The differences are associated with the ability of the different parameterizations to capture the surface information derived from the global input data, which is essential for determining the intensity of vertical turbulent mixing and the ground surface temperature, suggesting that a better representation of land use is fundamental for improving estimates of the IBL and of the other parameters used by atmospheric pollutant dispersion models.
Abstract:
The objective of this study was to determine the probabilities of occurrence of dry and wet spells in the region of Sete Lagoas, MG, Brazil, from a 66-year series of daily rainfall data, in order to support the definition of the best sowing date for maize. Days with rainfall lower than the maize evapotranspiration (ETmaize) were considered dry. The study was carried out for the flowering and grain-filling stages for seven sowing dates, SD (01/10, 16/10, 31/10, 15/11, 01/12, 16/12 and 31/12). The probabilities of occurrence of dry and wet spells were estimated using a Markov chain. The probability of occurrence of dry days was always higher than that of wet days. The highest probabilities of dry days were observed between SD 15/11 and 31/12. The highest probability of wet days was recorded for SD 01/10. Considering the mean cycle studied (for the most critical stage of maize), the combination of the lowest chance of dry spells with the highest chance of wet days indicates that the best dates to begin rainfed sowing would be SD 01/10 and 16/10.
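A minimal sketch of fitting the first-order two-state Markov chain used for the dry/wet day probabilities; the rainfall series is synthetic and the 5 mm/day ET threshold is hypothetical, not the Sete Lagoas data:

```python
import numpy as np

def wet_dry_transitions(rain_mm, et_maize_mm):
    """First-order two-state Markov chain on daily data: a day is 'dry' when
    rainfall is below the maize evapotranspiration. Returns the 2x2 matrix
    [[P(dry|dry), P(wet|dry)], [P(dry|wet), P(wet|wet)]]."""
    state = (np.asarray(rain_mm) >= et_maize_mm).astype(int)   # 0 = dry, 1 = wet
    counts = np.zeros((2, 2))
    for a, b in zip(state[:-1], state[1:]):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

# Synthetic 60-day rainfall series (mm) against a 5 mm/day ETmaize threshold.
rng = np.random.default_rng(0)
rain = rng.gamma(shape=0.8, scale=6.0, size=60)
print(wet_dry_transitions(rain, et_maize_mm=5.0))
```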