883 results for Hidden Markov Model
Abstract:
Hepatocellular carcinoma (HCC) is the sixth most frequent cancer and the third leading cause of cancer-related death worldwide, with approximately 600,000 deaths per year. In 70% of cases it develops in the presence of chronic liver disease such as cirrhosis or other inflammatory conditions, so screening methods aimed at early diagnosis could lead to a better prognosis. The objective of this work is to design a clinical pathway capable of standardizing the HCC screening process, supporting it with an economic evaluation of the intervention. A systematic literature search was performed and a clinical pathway for HCC surveillance in Colombia is proposed. A cost-effectiveness economic evaluation of the proposed intervention was then carried out using a Markov model, comparing the application of the proposed clinical pathway against current management in 100 patients considered at risk (cirrhosis, HBV carriers and/or HCV carriers) over a 30-year time horizon, with life-years saved (LYS) as the outcome, from the third-party payer perspective for Colombia at 2009 prices. The analysis shows a 40% reduction in mortality and an ICER of US$1,438 per LYS, from which we conclude that applying this screening proposal is cost-effective. A pilot test of the clinical pathway remains to be carried out.
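The cycle-by-cycle bookkeeping behind such a Markov cost-effectiveness model can be sketched as follows. This is a minimal illustration only: the three states and every transition probability and cost below are hypothetical, not the study's actual parameters.

```python
# Minimal Markov cohort model sketch. States and all numbers are
# hypothetical, chosen only to illustrate the LYS/ICER mechanics.
STATES = ["at_risk", "hcc", "dead"]

def run_cohort(trans, cohort=100, cycles=30):
    """Advance a cohort through yearly cycles; return total life-years."""
    dist = {"at_risk": float(cohort), "hcc": 0.0, "dead": 0.0}
    life_years = 0.0
    for _ in range(cycles):
        nxt = {s: 0.0 for s in STATES}
        for s, n in dist.items():
            for t, p in trans[s].items():
                nxt[t] += n * p
        dist = nxt
        life_years += dist["at_risk"] + dist["hcc"]  # sum over alive states
    return life_years

# Hypothetical yearly transition probabilities (each row sums to 1).
no_screen = {
    "at_risk": {"at_risk": 0.93, "hcc": 0.04, "dead": 0.03},
    "hcc":     {"at_risk": 0.0, "hcc": 0.55, "dead": 0.45},
    "dead":    {"at_risk": 0.0, "hcc": 0.0, "dead": 1.0},
}
# Screening detects HCC earlier, so HCC mortality is assumed lower here.
screen = {
    "at_risk": {"at_risk": 0.93, "hcc": 0.04, "dead": 0.03},
    "hcc":     {"at_risk": 0.0, "hcc": 0.75, "dead": 0.25},
    "dead":    {"at_risk": 0.0, "hcc": 0.0, "dead": 1.0},
}

lys_gain = run_cohort(screen) - run_cohort(no_screen)
extra_cost = 150_000.0          # hypothetical incremental cost (US$)
icer = extra_cost / lys_gain    # incremental cost-effectiveness ratio
```

The ICER is simply the incremental cost divided by the incremental effectiveness (here, life-years gained) between the two strategies.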
Abstract:
Introduction: neuropathic pain is a condition of considerable prevalence and socio-economic impact in the Latin American population; clinical evidence suggests that calcium-channel ligands and the lidocaine patch can successfully treat localized peripheral neuropathic pain. Methods: a retrospective, observational cost-effectiveness economic evaluation was performed with data extracted from the clinical records of patients treated at the pain clinic of the IPS. The primary effectiveness variable was pain improvement measured on a visual analogue scale. Results: 94 patients were studied, treated with gabapentin (G) 21, pregabalin (P) 24, gabapentin + lidocaine (G/L) 24, and pregabalin + lidocaine (P/L) 25; the costs associated with treatment were COP$114,070,835, COP$105,855,920, COP$88,717,481 and COP$89,854,712, respectively, and the number of patients with significant pain improvement was 8, 10, 9 and 21, respectively. The ICER of G/L relative to G was COP$ -25,353,354. The ICER of P/L relative to P was COP$ -1,454,655. Conclusions: adding the lidocaine patch to regular therapy (P/L) reduced the consumption of health resources such as co-analgesic drugs, rescue analgesics and drugs to control adverse reactions, as well as consultations with health professionals. Each patient managed with P/L represents a saving of COP$1,454,655 compared with management with the anticonvulsant alone; for G/L this saving is COP$25,353,354 relative to G alone.
Abstract:
Background Plasmodium vivax continues to be the most widely distributed malarial parasite species in tropical and sub-tropical areas, causing high morbidity indices around the world. Better understanding of the proteins used by the parasite during the invasion of red blood cells is required to obtain an effective vaccine against this disease. This study characterizes the P. vivax asparagine-rich protein (PvARP) and examines its antigenicity in natural infection. Methods The target gene in the study was selected according to a previous in silico analysis using profile hidden Markov models which identified P. vivax proteins that play a possible role in invasion. Transcription of the arp gene in the P. vivax VCG-1 strain was evaluated by RT-PCR. Specific human antibodies against PvARP were used to confirm protein expression by Western blot as well as its subcellular localization by immunofluorescence. Recognition of recombinant PvARP by sera from P. vivax-infected individuals was evaluated by ELISA. Results VCG-1 strain PvARP is a 281-residue-long molecule, which is encoded by a single exon and has an N-terminal secretion signal, as well as a tandem repeat region. This protein is expressed in mature schizonts and is located on the surface of merozoites, with an apparent accumulation towards their apical pole. Sera from P. vivax-infected patients recognized the recombinant protein, thereby suggesting that it is targeted by the immune response during infection.
Abstract:
The frequency of persistent atmospheric blocking events in the 40-yr ECMWF Re-Analysis (ERA-40) is compared with the blocking frequency produced by a simple first-order Markov model designed to predict the time evolution of a blocking index [defined by the meridional contrast of potential temperature on the 2-PVU surface (1 PVU ≡ 1 × 10−6 K m2 kg−1 s−1)]. With the observed spatial coherence built into the model, it is able to reproduce the main regions of blocking occurrence and the frequencies of sector blocking very well. This underlines the importance of the climatological background flow in determining the locations of high blocking occurrence as being the regions where the mean midlatitude meridional potential vorticity (PV) gradient is weak. However, when only persistent blocking episodes are considered, the model is unable to simulate the observed frequencies. It is proposed that this persistence beyond that given by a red noise model is due to the self-sustaining nature of the blocking phenomenon.
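The red-noise benchmark in the abstract above is a first-order Markov (AR(1)) process for the blocking index. The sketch below, with hypothetical autocorrelation, threshold and run-length parameters, shows why such a model underproduces long-lived events: run lengths decay geometrically, so persistent episodes are rare relative to short ones.

```python
import random

def simulate_index(r=0.8, sigma=1.0, n=10000, seed=1):
    """First-order Markov (red noise) model: x[t+1] = r*x[t] + noise."""
    rng = random.Random(seed)
    x, series = 0.0, []
    for _ in range(n):
        x = r * x + rng.gauss(0.0, sigma)
        series.append(x)
    return series

def persistent_events(series, thresh, min_len):
    """Count runs where the index stays below `thresh` (a weak meridional
    gradient, i.e. blocking) for at least `min_len` consecutive steps."""
    events = run = 0
    for x in series:
        if x < thresh:
            run += 1
        else:
            if run >= min_len:
                events += 1
            run = 0
    if run >= min_len:
        events += 1
    return events

series = simulate_index()
short_events = persistent_events(series, thresh=-1.0, min_len=2)
long_events = persistent_events(series, thresh=-1.0, min_len=10)
```

Under red noise, `long_events` is a small fraction of `short_events`; self-sustaining blocking would inflate the long-event count beyond this baseline.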
Abstract:
This study sets out to find the best calving pattern for small-scale dairy systems in Michoacan State, central Mexico. Two models were built. First, a linear programming model was constructed to optimize calving pattern and herd structure according to metabolizable energy availability. Second, a Markov chain model was built to investigate three reproductive scenarios (good, average and poor) in order to suggest factors that maintain the calving pattern given by the linear programming model. Though it was not possible to maintain the optimal linear programming pattern, the Markov chain model suggested adopting different reproduction strategies according to period of the year that the cow is expected to calve. Comparing different scenarios, the Markov model indicated the effect of calving interval on calving pattern and herd structure.
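A Markov chain herd model of this kind tracks how cows move between reproductive states, and its long-run herd structure is the chain's stationary distribution. A minimal sketch, with hypothetical states and monthly transition probabilities (not the paper's parameters):

```python
def stationary(P, iters=500):
    """Approximate the stationary distribution of a row-stochastic
    transition matrix P by power iteration."""
    n = len(P)
    dist = [1.0 / n] * n
    for _ in range(iters):
        dist = [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]
    return dist

# Hypothetical cow states: 0 = open (not pregnant), 1 = pregnant, 2 = dry.
# Rows sum to 1; entries are illustrative monthly transition probabilities.
P = [[0.6, 0.4, 0.0],
     [0.0, 0.7, 0.3],
     [0.5, 0.0, 0.5]]

herd_structure = stationary(P)  # long-run fraction of the herd per state
```

Shortening the calving interval corresponds to raising the open-to-pregnant probability, which shifts the stationary herd structure; this is the kind of comparison the reproductive scenarios in the study make.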
Abstract:
Numerous techniques exist which can be used for the task of behavioural analysis and recognition. Common amongst these are Bayesian networks and Hidden Markov Models. Although these techniques are extremely powerful and well developed, both have important limitations. By fusing these techniques together to form Bayes-Markov chains, the advantages of both techniques can be preserved, while reducing their limitations. The Bayes-Markov technique forms the basis of a common, flexible framework for supplementing Markov chains with additional features. This results in improved user output, and aids in the rapid development of flexible and efficient behaviour recognition systems.
Abstract:
The dynamics of inter-regional communication within the brain during cognitive processing – referred to as functional connectivity – are investigated as a control feature for a brain computer interface. EMDPL is used to map phase synchronization levels between all channel pair combinations in the EEG. This results in complex networks of channel connectivity at all time–frequency locations. The mean clustering coefficient is then used as a descriptive feature encapsulating information about inter-channel connectivity. Hidden Markov models are applied to characterize and classify dynamics of the resulting complex networks. Highly accurate levels of classification are achieved when this technique is applied to classify EEG recorded during real and imagined single finger taps. These results are compared to traditional features used in the classification of a finger tap BCI demonstrating that functional connectivity dynamics provide additional information and improved BCI control accuracies.
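The mean clustering coefficient used as the feature above is a standard graph quantity: for each node, the fraction of its neighbour pairs that are themselves connected, averaged over all nodes. A self-contained sketch on a 0/1 adjacency matrix (as would be obtained by thresholding the channel-pair synchronization values):

```python
def clustering_coefficient(adj, i):
    """Local clustering coefficient of node i in an undirected graph
    given as a 0/1 adjacency matrix: fraction of neighbour pairs
    that are themselves linked."""
    nbrs = [j for j, a in enumerate(adj[i]) if a and j != i]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(adj[u][v] for u in nbrs for v in nbrs if u < v)
    return 2.0 * links / (k * (k - 1))

def mean_clustering(adj):
    """Average local clustering coefficient over all nodes: one scalar
    feature summarizing inter-channel connectivity at a time-frequency point."""
    n = len(adj)
    return sum(clustering_coefficient(adj, i) for i in range(n)) / n

triangle = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]  # fully connected: coefficient 1
path = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]      # chain: no closed triangles
```

Computed over a sliding window, the sequence of `mean_clustering` values is exactly the kind of one-dimensional dynamic feature that can then be modelled with a hidden Markov model per class.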
Abstract:
Dynamic soundtracking presents various practical and aesthetic challenges to composers working with games. This paper presents an implementation of a system addressing some of these challenges with an affectively driven music generation algorithm based on a second-order Markov model. The system can respond in real time to emotional trajectories derived from the two dimensions of affect in the circumplex model (arousal and valence), which are mapped to five musical parameters. A transition matrix is employed to vary the generated output in continuous response to the affective state intended by the gameplay.
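In a second-order Markov model the next symbol is conditioned on the previous two, so the "transition matrix" is indexed by symbol pairs. A minimal generation sketch with a hypothetical note sequence (the actual system's parameter mappings are not reproduced here):

```python
import random

def build_transitions(sequence):
    """Second-order Markov model: store, for each pair of preceding
    symbols, the list of observed successors (an empirical transition table)."""
    table = {}
    for a, b, c in zip(sequence, sequence[1:], sequence[2:]):
        table.setdefault((a, b), []).append(c)
    return table

def generate(table, seed_pair, length, rng=None):
    """Walk the chain: each new symbol is drawn conditioned on the
    previous two, stopping early if an unseen pair is reached."""
    rng = rng or random.Random(0)
    out = list(seed_pair)
    for _ in range(length - 2):
        choices = table.get(tuple(out[-2:]))
        if not choices:
            break
        out.append(rng.choice(choices))
    return out

training = ["C", "E", "G", "E", "C", "E", "G", "C", "C", "E", "G", "E"]
table = build_transitions(training)
melody = generate(table, ("C", "E"), 16)
```

Affective control can then be layered on top, e.g. by reweighting the successor distributions as arousal and valence change, rather than sampling them uniformly.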
Abstract:
The Madden-Julian oscillation (MJO) is the most prominent form of tropical intraseasonal variability. This study investigated the following questions. Do inter-annual-to-decadal variations in tropical sea surface temperature (SST) lead to substantial changes in MJO activity? Was there a change in the MJO in the 1970s? Can this change be attributed to SST anomalies? What was the level of MJO activity in the pre-reanalysis era? These questions were investigated with a stochastic model of the MJO. Reanalysis data (1948-2008) were used to develop a nine-state first-order Markov model capable of simulating the non-stationarity of the MJO. The model is driven by observed SST anomalies, and a large ensemble of simulations was performed to infer the activity of the MJO in the instrumental period (1880-2008). The model reproduces the activity of the MJO during the reanalysis period. The simulations indicate that the MJO exhibited a regime of near-normal activity in 1948-1972 (3.4 events per year) and two regimes of high activity in 1973-1989 (3.9 events) and 1990-2008 (4.6 events). Stochastic simulations indicate decadal shifts with near-normal levels in 1880-1895 (3.4 events), low activity in 1896-1917 (2.6 events) and a return to near-normal levels during 1918-1947 (3.3 events). The results also point to significant decadal changes in the probability of very active years (5 or more MJO events): 0.214 (1880-1895), 0.076 (1896-1917), 0.197 (1918-1947) and 0.193 (1948-1972). After a change in behavior in the 1970s, this probability increased to 0.329 (1973-1989) and 0.510 (1990-2008). The observational and stochastic simulations presented here call attention to the need to further understand the variability of the MJO on a wide range of time scales.
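The core machinery of such a model is a first-order transition matrix estimated by counting observed state-to-state moves and then used to simulate new state sequences. A sketch over nine states, with a short hypothetical observation sequence standing in for the reanalysis-derived MJO phases (the real model additionally conditions on SST anomalies, which is omitted here):

```python
import random

def fit_markov(states, n_states):
    """Maximum-likelihood first-order transition matrix: count observed
    transitions and normalize each row."""
    counts = [[0] * n_states for _ in range(n_states)]
    for a, b in zip(states, states[1:]):
        counts[a][b] += 1
    P = []
    for row in counts:
        total = sum(row)
        P.append([c / total if total else 1.0 / n_states for c in row])
    return P

def simulate(P, start, steps, seed=0):
    """Draw one stochastic realization of the chain."""
    rng = random.Random(seed)
    s, path = start, [start]
    for _ in range(steps):
        s = rng.choices(range(len(P)), weights=P[s])[0]
        path.append(s)
    return path

# Hypothetical observed phase sequence over nine states (0..8).
obs = [0, 1, 2, 3, 4, 5, 6, 7, 8, 0, 1, 2, 2, 3, 4, 5, 6, 7, 8, 0]
P = fit_markov(obs, 9)
path = simulate(P, start=0, steps=50)
```

Running a large ensemble of such simulations, with the transition probabilities modulated by an external driver, is what allows event statistics (e.g. active years per decade) to be inferred for periods without direct observations.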
Abstract:
Robotic mapping is the process of automatically constructing an environment representation using mobile robots. We address the problem of semantic mapping, which consists of using mobile robots to create maps that represent not only metric occupancy but also other properties of the environment. Specifically, we develop techniques to build maps that represent activity and navigability of the environment. Our approach to semantic mapping is to combine machine learning techniques with standard mapping algorithms. Supervised learning methods are used to automatically associate properties of space to the desired classification patterns. We present two methods, the first based on hidden Markov models and the second on support vector machines. Both approaches have been tested and experimentally validated in two problem domains: terrain mapping and activity-based mapping.
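In the HMM-based variant, one model is trained per semantic class and a new observation sequence is labelled with the class whose model assigns it the highest likelihood (computed with the forward algorithm). A self-contained sketch with two hypothetical two-state models and binary observations; the class names, parameters and observations are illustrative, not the paper's:

```python
import math

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    via the scaled forward algorithm (pi: initial, A: transition,
    B: emission probabilities)."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    s = sum(alpha)
    loglik = math.log(s)
    alpha = [a / s for a in alpha]
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                 for j in range(n)]
        s = sum(alpha)
        loglik += math.log(s)
        alpha = [a / s for a in alpha]
    return loglik

def classify(obs, models):
    """Pick the class whose HMM best explains the sequence."""
    return max(models, key=lambda name: forward_loglik(obs, *models[name]))

pi = [0.5, 0.5]
A = [[0.9, 0.1], [0.5, 0.5]]
models = {
    # Hypothetical classes: 'navigable' mostly emits symbol 0,
    # 'rough_terrain' mostly emits symbol 1.
    "navigable":     (pi, A, [[0.9, 0.1], [0.8, 0.2]]),
    "rough_terrain": (pi, A, [[0.2, 0.8], [0.1, 0.9]]),
}

label = classify([0, 0, 1, 0, 0, 0], models)
```

In practice the per-class parameters would be learned from labelled sensor sequences (e.g. with Baum-Welch) rather than written by hand.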
Abstract:
The goal of this paper is to show the possibility of a non-monotone relation between coverage and risk, which has been considered in the literature on insurance models since the work of Rothschild and Stiglitz (1976). We present an insurance model where the insured agents are heterogeneous in risk aversion and in lenience (a prevention cost parameter). Risk aversion is described by a continuous parameter which is correlated with lenience and, for the sake of simplicity, we assume perfect correlation. In the case of positive correlation, the more risk-averse agent has a higher cost of prevention, leading to a higher demand for coverage. Equivalently, the single crossing property (SCP) is valid and implies a positive correlation between coverage and risk in equilibrium. On the other hand, if the correlation between risk aversion and lenience is negative, not only may the SCP be broken, but also the monotonicity of contracts, i.e., the prediction that high (low) risk-averse types choose full (partial) insurance. In both cases riskiness is monotonic in risk aversion, but in the latter case there are some coverage levels associated with two different risks (low and high), which implies that the ex-ante (with respect to the risk aversion distribution) correlation between coverage and riskiness may have any sign (even though the ex-post correlation is always positive). Moreover, using another instrument (a proxy for riskiness), we give a testable implication to disentangle single crossing and non-single crossing under an ex-post zero correlation result: the monotonicity of coverage as a function of riskiness. Since, controlling for risk aversion (no asymmetric information), coverage is a monotone function of riskiness, this also gives a test for asymmetric information. Finally, we relate these theoretical results to empirical tests in the recent literature, especially the work of Dionne, Gouriéroux and Vanasse (2001).
In particular, they found empirical evidence that seems to be compatible with asymmetric information and non-single crossing in our framework. More generally, we build a hidden information model showing how omitted variables (asymmetric information) can bias the sign of the correlation of equilibrium variables conditioning on all observable variables. We show that this may be the case when the omitted variables have a non-monotonic relation with the observable ones. Moreover, because this non-monotonic relation is deeply related to the failure of the SCP in one-dimensional screening problems, the existing literature on asymmetric information does not capture this feature. Hence, our main result is to point out the importance of the SCP in testing predictions of hidden information models.
Abstract:
Traditionally, ancillary services are supplied by large conventional generators. However, with the huge penetration of distributed generators (DGs) resulting from the growing interest in satisfying energy requirements, and considering the benefits they can bring to the electrical system and to the environment, it is reasonable to assume that ancillary services could also be provided by DGs in an economical and efficient way. In this paper, a settlement procedure for a reactive power market for DGs in distribution systems is proposed. Attention is directed to wind turbines connected to the network through permanent-magnet synchronous generators and doubly-fed induction generators. The generation uncertainty of this kind of DG is reduced by running a multi-objective optimization algorithm over multiple probabilistic scenarios generated with the Monte Carlo method and by representing the active power generated by the DGs through Markov models. The objectives to be minimized are the payments of the distribution system operator to the DGs for reactive power, the curtailment of transactions committed in a previously settled active power market, the losses in the lines of the network, and a voltage profile index. The proposed methodology was tested on a modified IEEE 37-bus distribution test system.
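Combining the two ingredients named above, Markov-modelled DG output and Monte Carlo scenario sampling, can be sketched as follows. The power levels and transition matrix are hypothetical; the actual study's optimization stage is not reproduced, only the scenario-generation step it feeds on.

```python
import random

def sample_scenario(P, levels, steps, rng):
    """One Monte Carlo scenario: a wind turbine's active-power output
    follows a Markov chain over discrete per-unit levels."""
    s = 0
    path = []
    for _ in range(steps):
        s = rng.choices(range(len(P)), weights=P[s])[0]
        path.append(levels[s])
    return path

# Hypothetical per-unit output states and hourly transition matrix.
levels = [0.0, 0.5, 1.0]
P = [[0.70, 0.25, 0.05],
     [0.20, 0.60, 0.20],
     [0.05, 0.25, 0.70]]

rng = random.Random(42)
scenarios = [sample_scenario(P, levels, steps=24, rng=rng) for _ in range(200)]
mean_output = sum(sum(s) for s in scenarios) / (200 * 24)
```

Each sampled scenario would then be fed to the multi-objective optimization, and results aggregated over the ensemble to account for generation uncertainty.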
Abstract:
Speech processing has become a technology increasingly based on the automatic modeling of large amounts of data. The success of research in this area is therefore directly tied to the availability of public-domain corpora and other specific resources, such as a phonetic dictionary. In Brazil, unlike the situation for English, for example, there is currently no public-domain large-vocabulary Automatic Speech Recognition (ASR) system for Brazilian Portuguese. Against this background, the main goal of this work is to discuss efforts within the FalaBrasil initiative [1], created by the Signal Processing Laboratory (LaPS) at UFPA, presenting research and software in the field of ASR for Brazilian Portuguese. More specifically, this work discusses the implementation of a large-vocabulary speech recognition system for Brazilian Portuguese using the HTK toolkit, which is based on hidden Markov models (HMM), and the creation of a grapheme-to-phone conversion module using machine learning techniques.
Abstract:
Speech recognition and synthesis systems are made up of language-dependent modules and, while many public resources exist for some languages (e.g. English and Japanese), resources for Brazilian Portuguese (BP) are still scarce. Another issue is that, for a large number of tasks, the error rate of current speech recognition systems remains high compared with that achieved by humans. Thus, despite the success of hidden Markov models (HMM), research into new methods is needed. This work is motivated by these two facts and is divided into two parts. The first describes the development of free resources and tools for speech recognition and synthesis in BP, consisting of audio and text corpora, a phonetic dictionary, a grapheme-to-phone converter, a syllable separator, and acoustic and language models. All the resources built are publicly available and, together with a proposed application programming interface, have been used to develop several new real-time applications, including a speech recognition module for the OpenOffice.org office suite. Performance tests of the developed systems are presented. The resources produced and released here make it easier for other research groups, developers and industry to adopt speech technology for BP. The second part of the work presents a new method for rescoring the output of HMM-based recognition, which is organized in a lattice data structure. More specifically, the system uses discriminative classifiers that aim to reduce the confusion between pairs of phones. For each of these binary problems, automatic feature selection techniques are used to choose the most suitable parametric representation for the problem at hand.