955 results for Markov Switching Models


Relevance:

30.00%

Publisher:

Abstract:

INTRODUCTION: Malaria is a serious problem in the Brazilian Amazon region, and the detection of possible risk factors could be of great interest for public health authorities. The objective of this article was to investigate the association between environmental variables and the yearly registers of malaria in the Amazon region using Bayesian spatiotemporal methods. METHODS: We used Poisson spatiotemporal regression models to analyze malaria counts in the Brazilian Amazon forest for the period from 1999 to 2008. In this study, we included covariates that could be important in the yearly prediction of malaria, such as the deforestation rate. We obtained the inferences using a Bayesian approach and Markov Chain Monte Carlo (MCMC) methods to simulate samples from the joint posterior distribution of interest. The discrimination of different models was also discussed. RESULTS: The model proposed here suggests that the deforestation rate, the number of inhabitants per km², and the human development index (HDI) are important in the prediction of malaria cases. CONCLUSIONS: It is possible to conclude that human development, population growth, deforestation, and their associated ecological alterations are conducive to increasing malaria risk. We conclude that the use of Poisson regression models that capture the spatial and temporal effects under the Bayesian paradigm is a good strategy for modeling malaria counts.
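
The abstract above describes a Bayesian Poisson regression estimated by MCMC. As a rough, self-contained illustration of that kind of model (not the paper's spatiotemporal specification), the sketch below fits a Poisson regression with a log link using a hand-rolled random-walk Metropolis sampler; the covariates (deforestation rate, population density, HDI) and all data are simulated placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical yearly data: three standardized covariates and malaria counts.
n = 200
X = np.column_stack([np.ones(n),            # intercept
                     rng.normal(size=n),    # deforestation rate (standardized)
                     rng.normal(size=n),    # inhabitants per km2 (standardized)
                     rng.normal(size=n)])   # HDI (standardized)
beta_true = np.array([1.0, 0.5, 0.3, -0.4])
y = rng.poisson(np.exp(X @ beta_true))

def log_post(beta):
    """Poisson log-likelihood with log link plus independent N(0, 10^2) priors."""
    eta = X @ beta
    return np.sum(y * eta - np.exp(eta)) - 0.5 * np.sum(beta**2) / 100.0

# Random-walk Metropolis sampler for the joint posterior of the coefficients.
beta, cur = np.zeros(X.shape[1]), log_post(np.zeros(X.shape[1]))
draws = []
for it in range(30000):
    prop = beta + 0.05 * rng.normal(size=beta.size)
    cand = log_post(prop)
    if np.log(rng.uniform()) < cand - cur:
        beta, cur = prop, cand
    if it >= 10000:                          # discard burn-in
        draws.append(beta.copy())

print("posterior means:", np.mean(draws, axis=0).round(2))
```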

Relevance:

30.00%

Publisher:

Abstract:

Modern fully integrated transceiver architectures require circuits with low area, low cost, low power, and high efficiency. A key block in modern transceivers is the power amplifier, which is studied in depth in this thesis. First, we study the implementation of a classical Class-A amplifier, describing the basic operation of an RF power amplifier and analysing the influence of the real models of the reactive components on its operation. Secondly, the Class-E amplifier is studied in depth. The different types of implementations are reviewed, and theoretical equations are derived and compared with simulations. Four modes of operation for the Class-E amplifier were selected in order to implement the output stage and subsequently compare the results. This led to the selection of the mode with the best trade-off between efficiency and harmonic distortion, lower power consumption, and higher output power. The optimal choice was a parallel circuit containing an inductor with a finite value. To complete the implementation of the PA in switching mode, a driver was implemented. The final block (output stage together with the driver) achieved 20% total efficiency (PAE) while transmitting 8 dBm of output power to a 50 Ω load, with a total harmonic distortion (THD) of 3% and a total consumption of 28 mW. All implementations are designed in standard 130 nm CMOS technology. The operating frequency is 2.4 GHz, and a 1.2 V DC power supply was considered. The proposed circuit is intended to be used in a Bluetooth transmitter; however, it has a wider range of applications.
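
As a quick check of how the reported figures relate, the sketch below converts the 8 dBm output power to milliwatts and computes drain efficiency and power-added efficiency against the 28 mW total consumption; the assumed driver input power is a placeholder, not a value taken from the thesis.

```python
def dbm_to_mw(p_dbm):
    """Convert power in dBm to milliwatts."""
    return 10 ** (p_dbm / 10.0)

p_out_mw = dbm_to_mw(8.0)   # ~6.3 mW delivered to the 50-ohm load
p_dc_mw = 28.0              # total DC consumption reported above
p_in_mw = dbm_to_mw(0.0)    # hypothetical driver input power (assumption)

drain_eff = p_out_mw / p_dc_mw
pae = (p_out_mw - p_in_mw) / p_dc_mw         # power-added efficiency

print(f"output power: {p_out_mw:.2f} mW")
print(f"drain efficiency: {100 * drain_eff:.1f} %")
print(f"PAE: {100 * pae:.1f} %")             # in the ballpark of the reported 20 %
```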

Relevance:

30.00%

Publisher:

Abstract:

INTRODUCTION: Hip fractures are responsible for excessive mortality, decreasing the 5-year survival rate by about 20%. From an economic perspective, they represent a major source of expense, with direct costs in hospitalization, rehabilitation, and institutionalization. The incidence rate sharply increases after the age of 70, but it can be reduced in women aged 70-80 years by therapeutic interventions. Recent analyses suggest that the most efficient strategy is to implement such interventions in women at the age of 70 years. As several guidelines recommend bone mineral density (BMD) screening of postmenopausal women with clinical risk factors, our objective was to assess the cost-effectiveness of two screening strategies applied to elderly women aged 70 years and older. METHODS: A cost-effectiveness analysis was performed using decision-tree analysis and a Markov model. Two alternative strategies, one measuring BMD of all women and one measuring BMD only of those having at least one risk factor, were compared with the reference strategy "no screening". Cost-effectiveness ratios were measured as cost per year gained without hip fracture. Most probabilities were based on data observed in the EPIDOS, SEMOF, and OFELY cohorts. RESULTS: In this model, which is mostly based on observed data, the strategy "screen all" was more cost-effective than "screen women at risk". For one woman screened at the age of 70 and followed for 10 years, the incremental (additional) cost-effectiveness ratios of these two strategies compared with the reference were 4,235 euros and 8,290 euros, respectively. CONCLUSION: The results of this model, under the assumptions described in the paper, suggest that in women aged 70-80 years, screening all women with dual-energy X-ray absorptiometry (DXA) would be more effective than no screening or screening only women with at least one risk factor. Cost-effectiveness studies based on decision-analysis trees may be useful tools for helping decision makers, and further models based on different assumptions should be performed to improve the level of evidence on cost-effectiveness ratios of the usual screening strategies for osteoporosis.
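
A minimal sketch of the kind of Markov cohort model referred to above, with a three-state (well / hip fracture / dead) transition matrix and yearly cycles; all transition probabilities and costs are hypothetical placeholders, not the EPIDOS/SEMOF/OFELY-based values used in the paper.

```python
import numpy as np

# States: 0 = well, 1 = hip fracture, 2 = dead; one-year cycles.
def fracture_free_years(P, years=10):
    """Expected number of years spent in the 'well' state over the horizon."""
    dist = np.array([1.0, 0.0, 0.0])         # cohort starts in the 'well' state
    total = 0.0
    for _ in range(years):
        total += dist[0]
        dist = dist @ P
    return total

# Hypothetical annual transition matrices (rows sum to 1).
P_no_screen = np.array([[0.955, 0.020, 0.025],
                        [0.000, 0.900, 0.100],
                        [0.000, 0.000, 1.000]])
P_screen = np.array([[0.965, 0.010, 0.025],   # screening plus treatment lowers fracture risk
                     [0.000, 0.900, 0.100],
                     [0.000, 0.000, 1.000]])

extra_cost = 800.0                            # hypothetical screening + treatment cost (euros)
gain = fracture_free_years(P_screen) - fracture_free_years(P_no_screen)
print(f"years without hip fracture gained: {gain:.3f}")
print(f"cost per year gained: {extra_cost / gain:.0f} euros")
```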

Relevance:

30.00%

Publisher:

Abstract:

This paper develops methods for Stochastic Search Variable Selection (currently popular with regression and Vector Autoregressive models) for Vector Error Correction models where there are many possible restrictions on the cointegration space. We show how this allows the researcher to begin with a single unrestricted model and either do model selection or model averaging in an automatic and computationally efficient manner. We apply our methods to a large UK macroeconomic model.
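
The full SSVS machinery for a Vector Error Correction model is involved; the sketch below only illustrates the underlying spike-and-slab idea on a plain linear regression, with a two-block Gibbs sampler over the coefficients and the inclusion indicators. Data and tuning constants are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: only the first two of six predictors matter.
n, k = 200, 6
X = rng.normal(size=(n, k))
beta_true = np.array([1.5, -1.0, 0.0, 0.0, 0.0, 0.0])
sigma = 1.0
y = X @ beta_true + sigma * rng.normal(size=n)

tau0, tau1, p_incl = 0.01, 10.0, 0.5      # spike sd, slab sd, prior inclusion probability
gamma = np.ones(k, dtype=int)
incl_draws = []

XtX, Xty = X.T @ X, X.T @ y
for it in range(5000):
    # 1) draw beta | gamma from its conjugate normal conditional
    D_inv = np.diag(1.0 / np.where(gamma == 1, tau1**2, tau0**2))
    V = np.linalg.inv(XtX / sigma**2 + D_inv)
    beta = rng.multivariate_normal(V @ Xty / sigma**2, V)
    # 2) draw each gamma_j | beta_j (Bernoulli with odds from the spike and slab densities)
    for j in range(k):
        l1 = p_incl * np.exp(-0.5 * beta[j]**2 / tau1**2) / tau1
        l0 = (1 - p_incl) * np.exp(-0.5 * beta[j]**2 / tau0**2) / tau0
        gamma[j] = rng.uniform() < l1 / (l0 + l1)
    if it >= 1000:
        incl_draws.append(gamma.copy())

print("posterior inclusion probabilities:", np.mean(incl_draws, axis=0).round(2))
```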

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Patients with rheumatoid arthritis (RA) with an inadequate response to TNF antagonists (aTNFs) may switch to an alternative aTNF or start treatment from a different class of drugs, such as rituximab (RTX). It remains unclear in which clinical settings these therapeutic strategies offer the most benefit. OBJECTIVE: To analyse the effectiveness of RTX versus alternative aTNFs on RA disease activity in different subgroups of patients. METHODS: A prospective cohort study of patients with RA who discontinued at least one aTNF and subsequently received either RTX or an alternative aTNF, nested within the Swiss RA registry (SCQM-RA), was carried out. The primary outcome, longitudinal improvement in the 28-joint count Disease Activity Score (DAS28), was analysed using multivariate regression models for longitudinal data and adjusted for potential confounders. RESULTS: Of the 318 patients with RA included, 155 received RTX and 163 received an alternative aTNF. The relative benefit of RTX varied with the type of prior aTNF failure: when the motive for switching was ineffectiveness of previous aTNFs, the longitudinal improvement in DAS28 was significantly better with RTX than with an alternative aTNF (p = 0.03; at 6 months, -1.34 (95% CI -1.54 to -1.15) vs -0.93 (95% CI -1.28 to -0.59), respectively). When the motive for switching was another cause, the longitudinal improvement in DAS28 was similar for RTX and alternative aTNFs (p = 0.40). These results were not significantly modified by the number of previous aTNF failures, the type of aTNF switch, or the presence of co-treatment with a disease-modifying antirheumatic drug. CONCLUSION: This observational study suggests that in patients with RA who have stopped a previous aTNF treatment because of ineffectiveness, changing to RTX is more effective than switching to an alternative aTNF.
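
A hedged sketch of the kind of longitudinal model described above, using statsmodels' mixed-effects formula interface with a random intercept per patient; the file name and column names are hypothetical, and this is not the registry's actual analysis code.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format registry extract: one row per patient visit, with
# columns patient_id, months (time since switch), treatment ('RTX' or 'aTNF'),
# das28, and baseline confounders.
df = pd.read_csv("ra_registry_visits.csv")   # hypothetical file name

# Random-intercept model for the DAS28 trajectory; the treatment-by-time
# interaction captures the differential longitudinal improvement.
model = smf.mixedlm(
    "das28 ~ months * treatment + baseline_das28 + disease_duration",
    data=df,
    groups=df["patient_id"],
)
print(model.fit().summary())
```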

Relevance:

30.00%

Publisher:

Abstract:

We study a business cycle model in which a benevolent fiscal authority must determine the optimal provision of government services while lacking credibility, lump-sum taxes, and the ability to bond-finance deficits. Households and the fiscal authority have risk-sensitive preferences. We find that outcomes are importantly affected by the household's risk sensitivity, but not by the fiscal authority's. Further, while household risk sensitivity induces a strong precautionary saving motive, which raises capital and lowers the return on assets, its effects on fluctuations and the business cycle are generally small, although more pronounced for negative shocks. Holding the stochastic steady state constant, increases in household risk sensitivity lower the risk-free rate and raise the return on equity, increasing the equity premium. Finally, although risk sensitivity has little effect on the provision of government services, it does cause the fiscal authority to lower the income tax rate. An additional contribution of this paper is to present a method for computing Markov-perfect equilibria in models where private agents and the government are risk-sensitive decision makers.
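
As a toy illustration of the risk-sensitive recursion underlying such preferences (not the paper's model), the sketch below iterates the Bellman operator V(s) = u(c(s)) - (beta/theta) * log E[exp(-theta * V(s')) | s] on a two-state endowment process; the economy and parameter values are placeholders.

```python
import numpy as np

# Toy two-state endowment process illustrating the risk-sensitive recursion
#   V(s) = u(c(s)) - (beta / theta) * log E[ exp(-theta * V(s')) | s ].
c = np.array([0.9, 1.1])                  # consumption in the low and high state
P = np.array([[0.8, 0.2],                 # Markov transition matrix
              [0.2, 0.8]])
beta = 0.95

def solve_value(theta, tol=1e-10):
    """Iterate the risk-sensitive Bellman operator to its fixed point."""
    V = np.zeros(2)
    while True:
        cont = -(1.0 / theta) * np.log(P @ np.exp(-theta * V))
        V_new = np.log(c) + beta * cont
        if np.max(np.abs(V_new - V)) < tol:
            return V_new
        V = V_new

for theta in (0.01, 1.0, 5.0):            # larger theta = more risk-sensitive
    print(f"theta = {theta:4.2f}: V = {solve_value(theta).round(4)}")
```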

Relevance:

30.00%

Publisher:

Abstract:

FtsK acts at the bacterial division septum to couple chromosome segregation with cell division. We demonstrate that a truncated FtsK derivative, FtsK(50C), uses ATP hydrolysis to translocate along duplex DNA as a multimer in vitro, consistent with FtsK having an in vivo role in pumping DNA through the closing division septum. FtsK(50C) also promotes a complete Xer recombination reaction between dif sites by switching the state of activity of the XerCD recombinases so that XerD makes the first pair of strand exchanges to form Holliday junctions that are then resolved by XerC. The reaction between directly repeated dif sites in circular DNA leads to the formation of uncatenated circles and is equivalent to the formation of chromosome monomers from dimers.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: In vitro aggregating brain cell cultures containing all types of brain cells have been shown to be useful for neurotoxicological investigations. The cultures are used for the detection of nervous system-specific effects of compounds by measuring multiple endpoints, including changes in enzyme activities. Concentration-dependent neurotoxicity is determined at several time points. METHODS: A Markov model was set up to describe the dynamics of brain cell populations exposed to potentially neurotoxic compounds. Brain cells were assumed to be either in a healthy or a stressed state, with only stressed cells being susceptible to cell death. Cells could switch between these states or die, with concentration-dependent transition rates. Since cell numbers were not directly measurable, intracellular lactate dehydrogenase (LDH) activity was used as a surrogate. Assuming that changes in cell numbers are proportional to changes in intracellular LDH activity, stochastic enzyme activity models were derived. Maximum likelihood and least squares regression techniques were applied to estimate the transition rates. Likelihood ratio tests were performed to test hypotheses about the transition rates. Simulation studies were used to investigate the performance of the transition rate estimators and to analyze the error rates of the likelihood ratio tests. The stochastic time-concentration activity model was applied to intracellular LDH activity measurements after 7 and 14 days of continuous exposure to propofol. The model describes transitions from healthy to stressed cells and from stressed cells to death. RESULTS: The model predicted that propofol would affect stressed cells more than healthy cells. Increasing the propofol concentration from 10 to 100 μM reduced the mean waiting time for the transition to the stressed state by 50%, from 14 to 7 days, whereas the mean duration to cellular death decreased more dramatically, from 2.7 days to 6.5 hours. CONCLUSION: The proposed stochastic modeling approach can be used to discriminate between different biological hypotheses regarding the effect of a compound on the transition rates. The effects of different compounds on the transition rate estimates can be quantitatively compared. Data can be extrapolated at late measurement time points to investigate whether costly and time-consuming long-term experiments could possibly be eliminated.
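
A minimal sketch of a three-state (healthy / stressed / dead) continuous-time Markov chain in which mean waiting times are the reciprocals of the transition rates, echoing the structure described above; the concentration-dependent rates below are hypothetical values loosely matching the quoted waiting times, not the estimated ones.

```python
import numpy as np
from scipy.linalg import expm

# States: 0 = healthy, 1 = stressed, 2 = dead.  Mean waiting times are the
# reciprocals of the transition rates, e.g. a healthy->stressed rate of 1/14
# per day corresponds to a mean of 14 days spent in the healthy state.
def generator(rate_hs, rate_sd):
    """CTMC generator for healthy -> stressed -> dead transitions."""
    return np.array([[-rate_hs, rate_hs, 0.0],
                     [0.0, -rate_sd, rate_sd],
                     [0.0, 0.0, 0.0]])

# Hypothetical concentration-dependent rates (per day), loosely echoing the
# waiting times quoted above; these are not the estimated transition rates.
rates = {10.0: (1 / 14.0, 1 / 2.7), 100.0: (1 / 7.0, 24 / 6.5)}

for conc, (rate_hs, rate_sd) in rates.items():
    Q = generator(rate_hs, rate_sd)
    for day in (7, 14):                        # the two measurement time points
        occupancy = np.array([1.0, 0.0, 0.0]) @ expm(Q * day)
        # Intracellular LDH is taken as proportional to the live fraction.
        live = occupancy[0] + occupancy[1]
        print(f"{conc:5.0f} uM, day {day:2d}: live fraction ~ {live:.2f}")
```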

Relevance:

30.00%

Publisher:

Abstract:

Quantitative or algorithmic trading is the automation of investment decisions obeying a fixed or dynamic set of rules to determine trading orders. It has increasingly made its way up to 70% of the trading volume of some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, due to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modeling financial markets, such as stochastic processes, stochastic integration, and basic models for price and spread dynamics necessary for building quantitative strategies. We also contrast these models with real market data sampled at minute frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behavior: trend following or mean reversion. The former is grouped in the so-called technical models and the latter in so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast into a well-defined scientific predictor if the signal they generate passes the test of being a Markov time. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; or, more technically, if the event is F_t-measurable. On the other hand, the concept of pairs trading, or market-neutral strategy, is fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, to a cointegration framework, to stochastic differential equations such as the well-known Ornstein-Uhlenbeck mean-reverting SDE and its variations. A model for forecasting any economic or financial magnitude could be properly defined with scientific rigor but could also lack any economic value and be considered useless from a practical point of view. This is why this project could not be complete without a backtesting of the mentioned strategies. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is the reason why we emphasize the calibration of the strategies' parameters to adapt to the given market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and that calibration must be done at a high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numeric computations and graphics used and shown in this project were implemented in MATLAB from scratch as part of this thesis. No other mathematical or statistical software was used.
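
As a small illustration of the mean-reversion ingredient discussed above (written in Python rather than the thesis's MATLAB), the sketch below simulates an Ornstein-Uhlenbeck spread and trades its z-score as a toy pairs strategy; parameters are arbitrary, and the backtest ignores costs and uses in-sample statistics.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate an Ornstein-Uhlenbeck spread  dS = kappa * (mu - S) dt + sigma dW
# with its exact discretization, then trade its z-score as a toy pairs strategy.
kappa, mu, sigma, dt, n = 5.0, 0.0, 0.5, 1 / 252, 2520
a = np.exp(-kappa * dt)
noise_sd = sigma * np.sqrt((1 - a**2) / (2 * kappa))
S = np.empty(n)
S[0] = mu
for t in range(1, n):
    S[t] = mu + a * (S[t - 1] - mu) + noise_sd * rng.normal()

# Mean-reversion signal: short the spread when rich, long when cheap.
# (In-sample mean/std and zero transaction costs: a toy, not a real backtest.)
z = (S - S.mean()) / S.std()
position = np.where(z > 1.0, -1.0, np.where(z < -1.0, 1.0, 0.0))

pnl = position[:-1] * np.diff(S)   # yesterday's position times today's spread change
print(f"toy cumulative P&L: {pnl.sum():.3f}")
print(f"fraction of days in the market: {(position != 0).mean():.2%}")
```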

Relevance:

30.00%

Publisher:

Abstract:

In this paper we propose a parsimonious regime-switching approach to model the correlations between assets, the threshold conditional correlation (TCC) model. This method allows the dynamics of the correlations to change from one state (or regime) to another as a function of observable transition variables. Our model is similar in spirit to Silvennoinen and Teräsvirta (2009) and Pelletier (2006), but with the appealing feature that it does not suffer from the curse of dimensionality. In particular, estimation of the parameters of the TCC involves a simple grid search procedure. In addition, it is easy to guarantee a positive definite correlation matrix because the TCC estimator is given by the sample correlation matrix, which is positive definite by construction. The methodology is illustrated by evaluating the behaviour of international equities, government bonds, and major exchange rates, first separately and then jointly. We also test for and allow different parts of the correlation matrix to be governed by different transition variables. For this, we estimate a multi-threshold TCC specification. Further, we evaluate the economic performance of the TCC model against a constant conditional correlation (CCC) estimator using a Diebold-Mariano type test. We conclude that threshold correlation modelling gives rise to a significant reduction in the portfolio's variance.
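
A toy sketch of the threshold idea described above: split the sample by a grid of candidate thresholds on the observable transition variable, use the sample correlation matrix within each regime, and keep the threshold with the highest Gaussian log-likelihood. This is a simplified reading of the TCC estimator, run on simulated data.

```python
import numpy as np

def tcc_fit(returns, z, grid):
    """Toy threshold conditional correlation: choose the threshold on the
    transition variable z that best separates two correlation regimes,
    scoring each split by the Gaussian log-likelihood of the standardized
    returns under the regime sample correlation matrices."""
    best = (None, -np.inf)
    std = (returns - returns.mean(0)) / returns.std(0)
    for c in grid:
        low, high = std[z <= c], std[z > c]
        if len(low) < 30 or len(high) < 30:       # require enough data per regime
            continue
        loglik = 0.0
        for block in (low, high):
            R = np.corrcoef(block, rowvar=False)  # sample correlation matrix
            _, logdet = np.linalg.slogdet(R)
            Rinv = np.linalg.inv(R)
            loglik += -0.5 * (len(block) * logdet
                              + np.einsum("ti,ij,tj->", block, Rinv, block))
        if loglik > best[1]:
            best = (c, loglik)
    return best

# Hypothetical example: correlation between two assets rises when volatility is high.
rng = np.random.default_rng(3)
n = 2000
z = np.abs(rng.normal(size=n))                    # transition variable (e.g. volatility)
rho = np.where(z > 1.0, 0.8, 0.2)
e1 = rng.normal(size=n)
e2 = rho * e1 + np.sqrt(1 - rho**2) * rng.normal(size=n)
returns = np.column_stack([e1, e2])

threshold, _ = tcc_fit(returns, z, grid=np.linspace(0.2, 2.0, 37))
print("estimated threshold:", round(float(threshold), 2))
```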

Relevance:

30.00%

Publisher:

Abstract:

The paper discusses the maintenance challenges of organisations with a huge number of devices and proposes the use of probabilistic models to assist monitoring and maintenance planning. The proposal assumes connectivity of instruments to report relevant features for monitoring. Also, the existence of enough historical records with diagnosed breakdowns is required to make probabilistic models reliable and useful for the predictive maintenance strategies based on them. Regular Markov models based on estimated failure and repair rates are proposed to calculate the availability of the instruments, and Dynamic Bayesian Networks are proposed to model cause-effect relationships that trigger predictive maintenance services based on the influence between observed features and previously documented diagnostics.
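
For the availability part, a minimal sketch of the standard two-state (up/down) Markov result A = mu / (lambda + mu); the instruments and rates below are hypothetical placeholders.

```python
def availability(failure_rate, repair_rate):
    """Steady-state availability of a two-state (up/down) Markov model:
    A = mu / (lambda + mu), equivalently MTTF / (MTTF + MTTR)."""
    return repair_rate / (failure_rate + repair_rate)

# Hypothetical rates estimated from historical breakdown records (per day).
instruments = {
    "flow meter": (1 / 180.0, 1 / 2.0),      # fails every ~180 days, ~2 days to repair
    "pressure sensor": (1 / 90.0, 1 / 0.5),
}
for name, (lam, mu) in instruments.items():
    print(f"{name}: availability = {availability(lam, mu):.4f}")
```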

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a method to conduct inference in panel VAR models with cross-unit interdependencies and time variations in the coefficients. The approach can be used to obtain multi-unit forecasts and leading indicators and to conduct policy analysis in multi-unit setups. The framework of analysis is Bayesian, and MCMC methods are used to estimate the posterior distribution of the features of interest. The model is reparametrized to resemble an observable index model, and specification searches are discussed. As an example, we construct leading indicators for inflation and GDP growth in the Euro area using G-7 information.
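
The sketch below is not the paper's panel specification with cross-unit interdependencies and time variation; it only illustrates the posterior-simulation step for a single-unit Bayesian VAR(1) under a flat prior, producing a distribution of one-step-ahead forecasts of the kind a leading-indicator exercise would summarize. Data are simulated.

```python
import numpy as np
from scipy.stats import invwishart

rng = np.random.default_rng(7)

# Simulate hypothetical data (say, inflation and GDP growth) from a VAR(1).
T, m = 200, 2
A_true = np.array([[0.5, 0.1],
                   [0.2, 0.4]])
data = np.zeros((T + 1, m))
for t in range(1, T + 1):
    data[t] = data[t - 1] @ A_true.T + 0.1 * rng.normal(size=m)

Y, X = data[1:], data[:-1]                  # regress y_t on y_{t-1} (constant omitted)
XtX_inv = np.linalg.inv(X.T @ X)
B_ols = XtX_inv @ X.T @ Y                   # columns hold the coefficients of each equation
S = (Y - X @ B_ols).T @ (Y - X @ B_ols)

# Posterior simulation under a flat prior:
#   Sigma | data ~ IW(S, T - m),  vec(B) | Sigma ~ N(vec(B_ols), Sigma kron (X'X)^-1)
forecasts = []
for _ in range(2000):
    Sigma = invwishart.rvs(df=T - m, scale=S, random_state=rng)
    b = rng.multivariate_normal(B_ols.flatten(order="F"), np.kron(Sigma, XtX_inv))
    B = b.reshape(B_ols.shape, order="F")
    forecasts.append(data[-1] @ B)           # one draw of the one-step-ahead forecast

forecasts = np.array(forecasts)
print("posterior mean forecast:", forecasts.mean(axis=0).round(3))
print("posterior std of forecast:", forecasts.std(axis=0).round(3))
```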

Relevance:

30.00%

Publisher:

Abstract:

This paper describes a methodology to estimate the coefficients, test specification hypotheses, and conduct policy exercises in multi-country VAR models with cross-unit interdependencies, unit-specific dynamics, and time variations in the coefficients. The framework of analysis is Bayesian: a prior flexibly reduces the dimensionality of the model and puts structure on the time variations; MCMC methods are used to obtain posterior distributions; and marginal likelihoods are used to check the fit of various specifications. Impulse responses and conditional forecasts are obtained with the output of the MCMC routine. The transmission of certain shocks across countries is analyzed.
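
As a small companion to the impulse-response part of the abstract, the sketch below computes Cholesky-orthogonalized impulse responses for a VAR(1) from a single, hypothetical draw of the coefficients and error covariance; in a Bayesian exercise this computation would be repeated over the MCMC draws and the responses summarized by posterior quantiles.

```python
import numpy as np

def impulse_responses(A, Sigma, horizon=12):
    """Orthogonalized impulse responses of a VAR(1) y_t = A y_{t-1} + e_t,
    using a Cholesky factorization of the error covariance to identify shocks.
    Returns an array of shape (horizon + 1, n_vars, n_shocks)."""
    n = A.shape[0]
    chol = np.linalg.cholesky(Sigma)
    irf = np.empty((horizon + 1, n, n))
    impact = chol
    for h in range(horizon + 1):
        irf[h] = impact
        impact = A @ impact
    return irf

# Hypothetical coefficient matrix and error covariance for two countries' GDP growth;
# the off-diagonal terms of A carry the cross-country transmission of shocks.
A = np.array([[0.5, 0.2],
              [0.1, 0.6]])
Sigma = np.array([[1.0, 0.3],
                  [0.3, 0.5]])

irf = impulse_responses(A, Sigma)
print("response of country 2 to a country-1 shock over the first 4 periods:")
print(irf[:4, 1, 0].round(3))
```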