937 results for Bayesian smoothing


Relevance: 10.00%

Abstract:

This comment corrects errors in the estimation process that appear in Martins (2001). The first error is in the parametric probit estimation, as the previously presented results do not maximize the log-likelihood function; at the global maximum, more variables become significant. As for the semiparametric estimation method, the kernel function used in Martins (2001) can take on both positive and negative values, which implies that the participation probability estimates may lie outside the interval [0,1]. We solve the problem by applying local smoothing in the kernel estimation, as suggested by Klein and Spady (1993).
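The sign problem described above can be illustrated with a short sketch (the data, bandwidth and kernels are hypothetical, and this is not the Klein and Spady correction itself): a non-negative second-order kernel makes the Nadaraya-Watson probability estimate a weighted average of 0s and 1s, so it must lie in [0,1], whereas a fourth-order kernel with negative lobes offers no such guarantee.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical participation data: binary outcome y, scalar index x.
x = rng.normal(size=200)
y = (x + rng.normal(size=200) > 0).astype(float)

def nw_probability(x0, x, y, kernel, h=0.5):
    """Nadaraya-Watson estimate of P(y = 1 | x = x0)."""
    w = kernel((x - x0) / h)
    return np.sum(w * y) / np.sum(w)

# Second-order Gaussian kernel: weights are non-negative, so the
# estimate is a proper weighted average and must lie in [0, 1].
gauss = lambda u: np.exp(-0.5 * u**2)

# Fourth-order (bias-reducing) kernel: negative for |u| > sqrt(3), so
# the estimate is no longer a weighted average and can leave [0, 1].
gauss4 = lambda u: 0.5 * (3.0 - u**2) * np.exp(-0.5 * u**2)

p_hat = nw_probability(2.5, x, y, gauss)    # guaranteed inside [0, 1]
p_hat4 = nw_probability(2.5, x, y, gauss4)  # may fall outside [0, 1]
```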

Relevance: 10.00%

Abstract:

We analyze a continuous-time bilateral double auction with two-sided incomplete information and a smallest money unit. A distinguishing feature of our model is that intermediate concessions are not observable by the adversary: they are communicated only to a passive auctioneer. An alternative interpretation is that of mediated bargaining. We show that an equilibrium using only the extreme agreements always exists, and we present the necessary and sufficient condition for the existence of (perfect Bayesian) equilibria that yield intermediate agreements. For the symmetric case with a uniform type distribution we compute the equilibria numerically. We find that the equilibrium that does not use compromise agreements is the least efficient; among the remaining equilibria, however, social welfare is lower the more compromise agreements are used.

Relevance: 10.00%

Abstract:

We study the incentives of candidates to enter or exit elections in order to strategically affect the outcome of a voting correspondence. We extend the results of Dutta, Jackson and Le Breton (2000), who considered only single-valued voting procedures, by allowing the outcomes of voting to consist of sets of candidates. We show that, if candidates form their preferences over sets according to expected utility theory and Bayesian updating, every unanimous and non-dictatorial voting correspondence violates candidate stability. When candidates are restricted to even-chance prior distributions, only dictatorial or bidictatorial rules are unanimous and candidate stable. We also analyze the implications of using other extension criteria to define candidate stability, which open the door to positive results.

Relevance: 10.00%

Abstract:

Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Since conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. Monte Carlo results show that the estimator performs well in comparison to other estimators that have been proposed for estimation of general DLV models.
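The core idea, estimating a conditional moment from an unconditional simulation via kernel smoothing, can be sketched as follows. The toy stochastic-volatility model and all parameter values below are illustrative stand-ins for "a model that can be simulated":

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a long path from a toy stochastic-volatility latent
# variable model: h is latent log-volatility, y is observable.
T = 50_000
h = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    h[t] = 0.9 * h[t - 1] + 0.3 * rng.normal()
    y[t] = np.exp(h[t] / 2) * rng.normal()

def cond_moment(x0, x, m, bw=0.2):
    """Kernel-smoothed (Nadaraya-Watson) estimate of E[m | x = x0]."""
    w = np.exp(-0.5 * ((x - x0) / bw) ** 2)
    return np.sum(w * m) / np.sum(w)

# Conditional second moment E[y_t^2 | y_{t-1} = 0], recovered from the
# single long simulation without ever simulating *conditional on* y_{t-1}.
m_hat = cond_moment(0.0, y[:-1], y[1:] ** 2)
```

Such kernel-smoothed moments at each trial parameter value could then be matched to their sample counterparts in a standard method-of-moments criterion.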

Relevance: 10.00%

Abstract:

Macroeconomic activity has become less volatile over the past three decades in most G7 economies. The current literature focuses on characterizing the volatility reduction and on explanations for this so-called "moderation" in each G7 economy separately. In contrast to analyses of individual countries or individual variables, this paper focuses on common characteristics of the reduction and common explanations for the moderation across the G7 countries. In particular, we study three explanations: structural changes in the economy, changes in common international shocks and changes in domestic shocks. We study these explanations in a unified model structure. To this end, we propose a Bayesian factor structural vector autoregressive model. Using the proposed model, we investigate whether common explanations can be found for all G7 economies when information is pooled from multiple domestic and international sources. Our empirical analysis suggests that volatility reductions can largely be attributed to declines in the magnitudes of the shocks in most G7 countries, while only in the U.K., the U.S. and Italy can they be partially attributed to structural changes in the economy. Analyzing the components of the volatility, we also find that domestic shocks, rather than common international shocks, account for a large part of the volatility reduction in most of the G7 countries. Finally, we find that after the mid-1980s the structure of the economy changed substantially in five of the G7 countries: Germany, Italy, Japan, the U.K. and the U.S.

Relevance: 10.00%

Abstract:

Purpose: While imatinib has revolutionized the treatment of chronic myeloid leukaemia (CML) and gastrointestinal stromal tumors (GIST), its pharmacokinetic-pharmacodynamic relationships have been poorly studied. This study aimed to explore the issue in oncologic patients, and to evaluate the specific influence of the target genotype in a GIST subpopulation. Patients and methods: Data from 59 patients (321 plasma samples) were collected during a previous pharmacokinetic study. Based on a population model developed for this purpose, individual post-hoc Bayesian estimates of pharmacokinetic parameters were derived and used to estimate drug exposure (AUC; area under the curve). Free-fraction parameters were deduced from a model incorporating plasma alpha1-acid glycoprotein levels. Associations between AUC (or clearance) and therapeutic response (coded on a 3-point scale) or tolerability (4-point scale) were explored by ordered logistic regression. The influence of KIT genotype on response was also assessed in GIST patients. Results: Total and free drug exposure correlated with the number of side effects (p < 0.005). A relationship with response was not evident in the whole patient set (with good responders tending to receive lower doses and bad responders higher doses). In GIST patients, however, higher free drug exposure predicted better responses. A strong association was notably observed in patients harboring an exon 9 mutation or a wild-type KIT, known to decrease tumor sensitivity towards imatinib (p < 0.005). Conclusions: Our results argue for further evaluation of the potential benefit of a therapeutic monitoring program for imatinib. Our data also suggest that stratification by genotype will be important in future trials.
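A post-hoc Bayesian (maximum a posteriori) estimate of an individual pharmacokinetic parameter can be sketched as follows. The one-compartment steady-state model, dose, prior and error values below are entirely hypothetical and far simpler than the population model used in the study:

```python
import numpy as np

# Hypothetical setting: steady-state dosing, log-normal population
# prior on clearance CL, proportional residual error.
dose, tau = 400.0, 24.0        # dose (mg) and dosing interval (h)
pop_cl, omega = 14.0, 0.3      # population CL (L/h), between-subject SD (log scale)
sigma = 0.2                    # proportional residual error
obs = np.array([1.1, 0.9])     # the patient's measured concentrations (mg/L)

def pred(cl):
    """Crude steady-state average concentration for this sketch model."""
    return dose / (cl * tau)

def neg_log_post(log_cl):
    """Negative log-posterior: log-normal prior + Gaussian likelihood."""
    cl = np.exp(log_cl)
    prior = 0.5 * ((log_cl - np.log(pop_cl)) / omega) ** 2
    lik = 0.5 * np.sum(((obs - pred(cl)) / (sigma * pred(cl))) ** 2)
    return prior + lik

# MAP estimate by grid search over log-clearance (no optimizer needed).
grid = np.linspace(np.log(2.0), np.log(60.0), 2000)
cl_map = np.exp(grid[np.argmin([neg_log_post(g) for g in grid])])
auc = dose / cl_map            # individual exposure over one dosing interval
```

The individual estimate is pulled between the population prior and the patient's own measurements, which is the essence of the post-hoc Bayesian step.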

Relevance: 10.00%

Abstract:

Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation currently represents the gold-standard TDM approach but requires computational assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. The number of drugs handled by the software varies widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly.
Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast for routine activities, including for non-experienced users. Computer-assisted TDM is gaining growing interest and should further improve, especially in terms of information system interfacing, user friendliness, data storage capability and report generation.

Relevance: 10.00%

Abstract:

C4 photosynthesis is an adaptation derived from the more common C3 photosynthetic pathway that confers higher productivity under warm temperatures and low atmospheric CO2 concentration [1, 2]. C4 evolution has been seen as a consequence of past atmospheric CO2 decline, such as the abrupt CO2 fall 32-25 million years ago (Mya) [3-6]. This relationship has never been tested rigorously, mainly because of a lack of accurate estimates of divergence times for the different C4 lineages [3]. In this study, we inferred a large phylogenetic tree for the grass family and estimated, through Bayesian molecular dating, the ages of the 17 or 18 independent grass C4 lineages. The first transition from C3 to C4 photosynthesis occurred in the Chloridoideae subfamily, 32.0-25.0 Mya. The link between CO2 decrease and the transition to C4 photosynthesis was tested by a novel maximum likelihood approach. We showed that the model incorporating atmospheric CO2 levels was significantly better than the null model, supporting the importance of the CO2 decline for the evolvability of C4 photosynthesis. This finding is relevant for understanding the origin of C4 photosynthesis in grasses, which is one of the most successful ecological and evolutionary innovations in plant history.

Relevance: 10.00%

Abstract:

In this paper, we quantitatively assess the welfare implications of alternative public education spending rules. To this end, we employ a dynamic stochastic general equilibrium model in which human capital externalities and public education expenditures, financed by distorting taxes, enhance the productivity of private education choices. We allow public education spending, as a share of output, to respond to various aggregate indicators in an attempt to minimize the market imperfection due to human capital externalities. We also expose the economy to varying degrees of uncertainty via changes in the variance of total factor productivity shocks. Our results indicate that, in the face of increasing aggregate uncertainty, active policy can significantly outperform passive policy (i.e. maintaining a constant public education to output ratio), but only when the policy instrument is successful in smoothing the growth rate of human capital.

Relevance: 10.00%

Abstract:

This paper contributes to the ongoing empirical debate regarding the role of the RBC model, and in particular of technology shocks, in explaining aggregate fluctuations. To this end we estimate the model's posterior density using Markov chain Monte Carlo (MCMC) methods. Within this framework we extend Ireland's (2001, 2004) hybrid estimation approach to allow for a vector autoregressive moving average (VARMA) process to describe the movements and co-movements of the model's errors not explained by the basic RBC model. The results of marginal likelihood ratio tests reveal that the more general model of the errors significantly improves the model's fit relative to the VAR and AR alternatives. Moreover, despite setting the RBC model a more difficult task under the VARMA specification, our analysis, based on forecast error and spectral decompositions, suggests that the RBC model is still capable of explaining a significant fraction of the observed variation in macroeconomic aggregates in the post-war U.S. economy.
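Posterior simulation by MCMC can be illustrated with a minimal random-walk Metropolis sketch. The AR(1) toy model, flat prior and tuning constants below are illustrative stand-ins, not the paper's RBC model:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy data: an AR(1) process with true persistence 0.8.
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.8 * y[t - 1] + rng.normal()

def log_post(rho):
    """Log-posterior: flat prior on (-0.99, 0.99), Gaussian likelihood."""
    if not -0.99 < rho < 0.99:
        return -np.inf
    resid = y[1:] - rho * y[:-1]
    return -0.5 * np.sum(resid**2)

# Random-walk Metropolis: propose a local move, accept with probability
# min(1, posterior ratio).
draws, rho = [], 0.0
for _ in range(5000):
    prop = rho + 0.05 * rng.normal()
    if np.log(rng.uniform()) < log_post(prop) - log_post(rho):
        rho = prop
    draws.append(rho)

rho_mean = np.mean(draws[1000:])   # posterior mean after burn-in
```

In the paper's setting the same accept/reject logic would run over the full RBC (plus VARMA error) parameter vector rather than a single scalar.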

Relevance: 10.00%

Abstract:

Block factor methods offer an attractive approach to forecasting with many predictors. They extract the information in the predictors into factors reflecting different blocks of variables (e.g. a price block, a housing block, a financial block, etc.). However, a forecasting model that simply includes all blocks as predictors risks being over-parameterized. It is therefore desirable to use a methodology that allows different parsimonious forecasting models to hold at different points in time. In this paper, we use dynamic model averaging and dynamic model selection to achieve this goal. These methods automatically alter the weights attached to different forecasting models as evidence arrives about which has forecast well in the recent past. In an empirical study forecasting output growth and inflation using 139 UK monthly time series variables, we find that the set of predictors changes substantially over time. Furthermore, our results show that dynamic model averaging and model selection can greatly improve forecast performance relative to traditional forecasting methods.
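The weight-updating mechanism can be sketched with a forgetting-factor recursion in the style commonly used for dynamic model averaging: past weights are flattened by a forgetting factor, then rewarded by each model's predictive likelihood for the newest observation. The forgetting factor and likelihood values below are illustrative:

```python
import numpy as np

alpha = 0.95                          # forgetting factor (illustrative)

def dma_update(weights, pred_lik):
    """One dynamic-model-averaging weight update."""
    w = weights ** alpha              # discount (flatten) past performance
    w /= w.sum()
    w *= pred_lik                     # reward models that predicted y_t well
    return w / w.sum()

# Three candidate block-factor forecasting models, equal initial weights.
w = np.array([1 / 3, 1 / 3, 1 / 3])

# Suppose the second model keeps producing the best one-step-ahead
# predictive likelihood: its weight should come to dominate.
for _ in range(10):
    w = dma_update(w, np.array([0.2, 0.5, 0.3]))

best = int(np.argmax(w))              # dynamic model selection: index 1 here
```

Because the forgetting factor discounts old evidence geometrically, the weights can shift back quickly if a different block-factor model starts forecasting better.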

Relevance: 10.00%

Abstract:

This paper develops methods for Stochastic Search Variable Selection (currently popular with regression and Vector Autoregressive models) for Vector Error Correction models where there are many possible restrictions on the cointegration space. We show how this allows the researcher to begin with a single unrestricted model and either do model selection or model averaging in an automatic and computationally efficient manner. We apply our methods to a large UK macroeconomic model.
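A single SSVS Gibbs step for the inclusion indicators can be sketched as follows. Each coefficient's prior is a mixture of a tight "spike" and a diffuse "slab" normal, and the indicator is drawn from its conditional Bernoulli posterior given the current coefficient draw. The prior settings and coefficient values are illustrative, and the paper's VECM setting additionally restricts the cointegration space:

```python
import numpy as np

rng = np.random.default_rng(3)

tau0, tau1 = 0.01, 10.0          # spike / slab standard deviations
p_incl = 0.5                     # prior inclusion probability

def normal_pdf(x, sd):
    return np.exp(-0.5 * (x / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

def draw_indicators(beta):
    """Draw inclusion indicators from their conditional posterior."""
    slab = p_incl * normal_pdf(beta, tau1)          # evidence for inclusion
    spike = (1 - p_incl) * normal_pdf(beta, tau0)   # evidence for exclusion
    prob = slab / (slab + spike)
    return rng.uniform(size=beta.size) < prob

beta = np.array([0.9, 0.002, -1.4])  # current Gibbs draws of the coefficients
gamma = draw_indicators(beta)        # large betas land in the slab
```

Averaging the sampled indicators over many Gibbs sweeps gives posterior inclusion probabilities, which support either model averaging or automatic model selection.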

Relevance: 10.00%

Abstract:

We introduce duration-dependent skill decay among the unemployed into a New Keynesian model with hiring frictions developed by Blanchard and Gali (2008). If the central bank responds only to (current, lagged or expected future) inflation and quarterly skill decay is above a threshold level, determinacy requires a coefficient on inflation smaller than one. The threshold level is plausible with little steady-state hiring and firing (the "Continental European calibration") but implausibly high in the opposite case (the "American calibration"). Neither interest-rate smoothing nor responding to the output gap helps to restore determinacy if skill decay exceeds the threshold level. However, a modest response to unemployment guarantees determinacy. Moreover, under indeterminacy, both an adverse sunspot shock and an adverse technology shock increase unemployment extremely persistently.

Relevance: 10.00%

Abstract:

This paper develops stochastic search variable selection (SSVS) for zero-inflated count models which are commonly used in health economics. This allows for either model averaging or model selection in situations with many potential regressors. The proposed techniques are applied to a data set from Germany considering the demand for health care. A package for the free statistical software environment R is provided.

Relevance: 10.00%

Abstract:

The evolution of key innovations, novel traits that promote diversification, is often seen as a major driver of the unequal distribution of species richness within the tree of life. In this study, we aim to determine the factors underlying the extraordinary radiation of the subfamily Bromelioideae, one of the most diverse clades within the neotropical plant family Bromeliaceae. Based on an extended molecular phylogenetic data set, we examine the effect of two putative key innovations, Crassulacean acid metabolism (CAM) and the water-impounding tank, on speciation and extinction rates. To this end, we develop a novel Bayesian implementation of the binary-state speciation and extinction phylogenetic comparative method, which enables hypothesis testing by Bayes factors and accommodates model-selection uncertainty through Bayesian model averaging. Both CAM and the tank habit were found to correlate with increased net diversification, thus fulfilling the criteria for key innovations. Our analyses further revealed that CAM photosynthesis is correlated with a twofold increase in speciation rate, whereas the evolution of the tank primarily affected extinction rates, which were found to be five times lower in tank-forming lineages than in tank-less clades. These differences are discussed in the light of biogeography, ecology, and past climate change.
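The Bayes-factor and model-averaging computations can be sketched as follows; the two competing hypotheses and their log marginal likelihoods are invented purely for illustration:

```python
import numpy as np

# Hypothetical log marginal likelihoods for two diversification models:
# trait-dependent rates vs. trait-independent rates.
log_ml = {"state-dependent": -1040.2, "state-independent": -1043.7}

# Bayes factor in favor of state-dependent diversification.
log_bf = log_ml["state-dependent"] - log_ml["state-independent"]
bf = np.exp(log_bf)            # here exp(3.5), roughly 33

# Bayesian model averaging: posterior model probabilities under
# equal prior model probabilities (computed stably in log space).
logs = np.array(list(log_ml.values()))
post = np.exp(logs - logs.max())
post /= post.sum()             # weights used to average rate estimates
```

Averaging speciation and extinction estimates with these posterior model probabilities is what lets the analysis acknowledge model-selection uncertainty rather than conditioning on a single best model.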