995 results for Gibbs excess models


Relevance: 20.00%

Abstract:

Four general equilibrium search models are compared quantitatively. The baseline framework is a calibrated macroeconomic model of the US economy designed for a welfare analysis of unemployment insurance policy. The other models make three simple and natural specification changes, regarding tax incidence, monopsony power in wage determination, and the relevant threat point. These specification changes have a major impact on the equilibrium and on the welfare implications of unemployment insurance, partly because search externalities magnify the effects of wage changes. The optimal level of unemployment insurance depends strongly on whether raising benefits has a larger impact on search effort or on hiring expenditure.

Relevance: 20.00%

Abstract:

We propose a method to estimate time-invariant cyclical DSGE models using the information provided by a variety of filters. We treat data filtered with alternative procedures as contaminated proxies of the relevant model-based quantities and estimate structural and non-structural parameters jointly using a signal extraction approach. We employ simulated data to illustrate the properties of the procedure and compare our conclusions with those obtained when just one filter is used. We revisit the role of money in the transmission of monetary business cycles.
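
The following is a minimal sketch of the signal-extraction idea, not the paper's DSGE specification: two series filtered by different procedures are treated as contaminated measurements of a single latent cycle, and the cycle's law of motion and the noise scales are estimated jointly by Kalman-filter maximum likelihood. The AR(1) cycle and all parameter names are illustrative assumptions.

```python
# Sketch: joint estimation from several "filtered" proxies of one latent cycle.
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, Y):
    """Negative Gaussian log-likelihood of Y (T x 2) for a scalar AR(1) state."""
    phi, log_sc, log_s1, log_s2 = params
    if abs(phi) >= 0.999:
        return 1e10                                  # enforce stationarity
    q = np.exp(2 * log_sc)                           # state innovation variance
    R = np.diag([np.exp(2 * log_s1), np.exp(2 * log_s2)])
    Z = np.ones((2, 1))                              # both proxies load on c_t
    c, P = 0.0, q / (1 - phi**2)                     # stationary initialization
    ll = 0.0
    for y in Y:
        c_pred, P_pred = phi * c, phi**2 * P + q     # prediction step
        v = y - Z.ravel() * c_pred                   # 2-vector innovation
        F = P_pred * (Z @ Z.T) + R
        Finv = np.linalg.inv(F)
        K = P_pred * (Z.T @ Finv)                    # 1 x 2 Kalman gain
        c = c_pred + float(K @ v)                    # update step
        P = float((1.0 - K @ Z) * P_pred)
        ll += -0.5 * (np.linalg.slogdet(F)[1] + v @ Finv @ v + 2 * np.log(2 * np.pi))
    return -ll

# simulate a cycle and two noisy "filtered" proxies of it
rng = np.random.default_rng(0)
T, phi0 = 300, 0.9
c = np.zeros(T)
for t in range(1, T):
    c[t] = phi0 * c[t - 1] + rng.normal(0, 0.5)
Y = np.column_stack([c + rng.normal(0, 0.3, T),     # proxy from "filter 1"
                     c + rng.normal(0, 0.4, T)])    # proxy from "filter 2"

res = minimize(neg_loglik, x0=[0.5, -1.0, -1.0, -1.0], args=(Y,),
               method="Nelder-Mead")
print(f"phi ~ {res.x[0]:.2f} (true {phi0})")
```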

Relevance: 20.00%

Abstract:

In this paper we use Malliavin calculus techniques to obtain an expression for the short-time behavior of the at-the-money implied volatility skew for a generalization of the Bates model, in which the volatility need be neither a diffusion nor a Markov process, as the examples in Section 7 show. This expression depends on the derivative of the volatility in the sense of Malliavin calculus.
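
For reference, the baseline Bates model that such work generalizes is standardly written (in conventional notation; the specific parametrization below is not necessarily the paper's) as

$$\frac{dS_t}{S_{t^-}} = \mu\,dt + \sqrt{v_t}\,dW_t + dJ_t, \qquad dv_t = \kappa(\theta - v_t)\,dt + \xi\sqrt{v_t}\,dZ_t, \qquad d\langle W, Z\rangle_t = \rho\,dt,$$

where $J_t$ is a compound Poisson jump process independent of $(W, Z)$. The generalization referred to above replaces the square-root variance dynamics with a volatility process that need not be a diffusion or Markovian, and the short-time at-the-money skew is then expressed through the Malliavin derivative $D_s\sigma_r$ of that process.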

Relevance: 20.00%

Abstract:

In this paper we propose a metaheuristic to solve a new version of the Maximum Capture Problem. In the original MCP, market capture is obtained through lower traveling distances or lower traveling times; in this new version, not only the traveling time but also the waiting time affects the market share. This problem is hard to solve using standard optimization techniques. Metaheuristics are shown to offer accurate results within acceptable computing times.
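
Below is a minimal sketch of a capture heuristic in the spirit described above: customers patronize the facility with the lowest combined travel-plus-waiting time, and we pick p locations for the entering firm by greedy construction followed by swap-based local search. The attraction rule, the data, and the parameter names are illustrative assumptions, not the paper's formulation.

```python
# Greedy construction + swap local search for a waiting-time-aware capture problem.
import itertools
import numpy as np

rng = np.random.default_rng(1)
n_cust, n_sites, p = 60, 12, 3
demand = rng.uniform(1, 10, n_cust)
travel = rng.uniform(1, 30, (n_cust, n_sites))   # travel time to each candidate site
waiting = rng.uniform(0, 15, n_sites)            # expected waiting time at each site
comp_cost = rng.uniform(5, 25, n_cust)           # best travel+waiting offered by competitors

def captured(sites):
    """Demand won: a customer is captured if our best travel+waiting beats the competitor."""
    ours = (travel[:, sites] + waiting[sites]).min(axis=1)
    return demand[ours < comp_cost].sum()

# greedy construction: add the site with the largest marginal capture
sites = []
for _ in range(p):
    best = max((s for s in range(n_sites) if s not in sites),
               key=lambda s: captured(sites + [s]))
    sites.append(best)

# swap local search until no improving exchange exists
improved = True
while improved:
    improved = False
    for s_out, s_in in itertools.product(list(sites), range(n_sites)):
        if s_in in sites:
            continue
        cand = [s_in if s == s_out else s for s in sites]
        if captured(cand) > captured(sites):
            sites, improved = cand, True
            break

print(sorted(sites), round(captured(sites), 1))
```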

Relevance: 20.00%

Abstract:

This paper discusses inference in self-exciting threshold autoregressive (SETAR) models. Of main interest is inference for the threshold parameter. It is well known that the asymptotics of the corresponding estimator depend upon whether the SETAR model is continuous or not. In the continuous case, the limiting distribution is normal and standard inference is possible. In the discontinuous case, the limiting distribution is non-normal and cannot be estimated consistently. We show that valid inference can be drawn by the use of the subsampling method. Moreover, the method can even be extended to situations where the (dis)continuity of the model is unknown. In this case, the inference for the regression parameters of the model also becomes difficult, and subsampling can be used advantageously there as well. In addition, we consider a hypothesis test for the continuity of the SETAR model. A simulation study examines small-sample performance.
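
As a rough illustration, and not the paper's procedure, the sketch below fits a two-regime SETAR(1) by least-squares grid search over the threshold and then recomputes the estimator on subsamples (overlapping blocks); the model, block size, and trimming are all illustrative assumptions.

```python
# Threshold estimation by grid search, with a crude subsampling illustration.
import numpy as np

rng = np.random.default_rng(2)

def simulate_setar(n, r=0.0, a1=0.6, a2=-0.4):
    y = np.zeros(n)
    for t in range(1, n):
        a = a1 if y[t - 1] <= r else a2
        y[t] = a * y[t - 1] + rng.normal()
    return y

def fit_threshold(y):
    """Grid-search the threshold minimizing the pooled residual sum of squares."""
    x, z = y[:-1], y[1:]
    best_r, best_rss = None, np.inf
    for r in np.quantile(x, np.linspace(0.15, 0.85, 71)):  # trimmed grid
        rss = 0.0
        for mask in (x <= r, x > r):
            xm, zm = x[mask], z[mask]
            a = (xm @ zm) / (xm @ xm)          # regime-wise OLS slope
            rss += ((zm - a * xm) ** 2).sum()
        if rss < best_rss:
            best_r, best_rss = r, rss
    return best_r

y = simulate_setar(800)
r_hat = fit_threshold(y)

# subsampling: refit on overlapping blocks of length b
# (the paper's intervals rescale by the estimator's convergence rate; omitted here)
b = 120
reps = [fit_threshold(y[s:s + b]) for s in range(0, len(y) - b, 10)]
lo, hi = np.quantile(reps, [0.025, 0.975])
print(f"threshold estimate {r_hat:.3f}, subsample spread ({lo:.3f}, {hi:.3f})")
```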

Relevance: 20.00%

Abstract:

We study the statistical properties of three estimation methods for a model of learning that is often fitted to experimental data: quadratic deviation measures without unobserved heterogeneity, and maximum likelihood with and without unobserved heterogeneity. After discussing identification issues, we show that the estimators are consistent and provide their asymptotic distribution. Using Monte Carlo simulations, we show that ignoring unobserved heterogeneity can lead to seriously biased estimates in samples of the length typical of actual experiments. Better small-sample properties are obtained if unobserved heterogeneity is introduced. That is, rather than estimating the parameters for each individual, the individual parameters are treated as random variables, and the distribution of those random variables is estimated.
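
A minimal sketch of that last idea, with an illustrative binary-choice stand-in for the learning model: each subject's parameter is a random draw from a normal population, and the population parameters (mu, sigma) are estimated by integrating the per-subject likelihood over that distribution with Monte Carlo draws.

```python
# Simulated maximum likelihood with unobserved heterogeneity (illustrative model).
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit  # logistic CDF

rng = np.random.default_rng(3)
n_subj, n_trials, mu_true, sigma_true = 100, 50, 0.5, 0.8
lam = rng.normal(mu_true, sigma_true, n_subj)             # subject-level parameters
y = rng.random((n_subj, n_trials)) < expit(lam)[:, None]  # binary outcomes

z = rng.normal(size=200)  # common draws reused across likelihood evaluations

def neg_loglik(theta):
    mu, log_sigma = theta
    lam_s = mu + np.exp(log_sigma) * z                    # simulated parameters
    p = expit(lam_s)                                      # success prob per draw
    k = y.sum(axis=1)                                     # successes per subject
    # per-subject likelihood averaged over draws
    # (the binomial coefficient is constant in theta and can be dropped)
    L = (p[None, :] ** k[:, None]
         * (1 - p[None, :]) ** (n_trials - k[:, None])).mean(axis=1)
    return -np.log(L).sum()

res = minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"mu ~ {mu_hat:.2f} (true {mu_true}), sigma ~ {sigma_hat:.2f} (true {sigma_true})")
```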

Relevance: 20.00%

Abstract:

While the theoretical industrial organization literature has long argued that excess capacity can be used to deter entry into markets, there is little empirical evidence that incumbent firms effectively behave in this way. Bagwell and Ramey (1996) propose a game with a specific sequence of moves and partially-recoverable capacity costs in which forward induction provides a theoretical rationalization for firm behavior in the field. We conduct an experiment with a game inspired by their work. In our data the incumbent tends to keep the market, in contrast to what the forward induction argument of Bagwell and Ramey would suggest. The results indicate that players perceive that the first mover has an advantage without having to pre-commit capacity. In our game, evolution and learning do not drive out this perception. We back these claims with data analysis, a theoretical framework for dynamics, and simulation results.

Relevance: 20.00%

Abstract:

New location models are presented here for exploring the reduction of facilities in a region. The first of these models considers firms ceding market share to competitors under situations of financial exigency. The goal of this model is to cede the least market share, i.e., retain as much of the customer base as possible while shedding costly outlets. The second model considers a firm essentially without competition that must shrink its services for economic reasons. This firm is assumed to close outlets so that the degradation of service is limited. An example is offered within a competitive environment to demonstrate the usefulness of this modeling approach.
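
A minimal sketch of the first model's logic: close a fixed number of outlets while ceding the least market share, where each customer patronizes the nearest open outlet and defects to a competitor if the competitor becomes closer. Greedy one-at-a-time closure is used; the data and the nearest-outlet choice rule are illustrative assumptions, not the paper's formulation.

```python
# Greedy outlet closure minimizing the market share ceded to competitors.
import numpy as np

rng = np.random.default_rng(4)
n_cust, n_own, n_close = 200, 10, 4
demand = rng.uniform(1, 5, n_cust)
d_own = rng.uniform(0, 20, (n_cust, n_own))  # distance to each of our outlets
d_comp = rng.uniform(0, 20, n_cust)          # distance to nearest competitor

def share_kept(open_mask):
    """Demand retained: customers whose nearest open outlet beats the competitor."""
    if not open_mask.any():
        return 0.0
    nearest = d_own[:, open_mask].min(axis=1)
    return demand[nearest <= d_comp].sum()

open_mask = np.ones(n_own, dtype=bool)
for _ in range(n_close):
    # close the outlet whose removal cedes the least market share
    best_j, best_kept = None, -1.0
    for j in np.flatnonzero(open_mask):
        trial = open_mask.copy()
        trial[j] = False
        kept = share_kept(trial)
        if kept > best_kept:
            best_j, best_kept = j, kept
    open_mask[best_j] = False
    print(f"closed outlet {best_j}, demand retained {best_kept:.1f}")
```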

Relevance: 20.00%

Abstract:

BACKGROUND: The magnitude of risk conferred by the interaction between tobacco and alcohol use on the risk of head and neck cancers is not clear because studies have used various methods to quantify the excess head and neck cancer burden. METHODS: We analyzed individual-level pooled data from 17 European and American case-control studies (11,221 cases and 16,168 controls) participating in the International Head and Neck Cancer Epidemiology consortium. We estimated the multiplicative interaction parameter (psi) and population attributable risks (PAR). RESULTS: A greater than multiplicative joint effect between ever tobacco and alcohol use was observed for head and neck cancer risk (psi = 2.15; 95% confidence interval, 1.53-3.04). The PAR for tobacco or alcohol was 72% (95% confidence interval, 61-79%) for head and neck cancer, of which 4% was due to alcohol alone, 33% was due to tobacco alone, and 35% was due to tobacco and alcohol combined. The total PAR differed by subsite (64% for oral cavity cancer, 72% for pharyngeal cancer, 89% for laryngeal cancer), by sex (74% for men, 57% for women), by age (33% for cases <45 years, 73% for cases >60 years), and by region (84% in Europe, 51% in North America, 83% in Latin America). CONCLUSIONS: Our results confirm that the joint effect between tobacco and alcohol use is greater than multiplicative on head and neck cancer risk. However, a substantial proportion of head and neck cancers cannot be attributed to tobacco or alcohol use, particularly for oral cavity cancer and for head and neck cancer among women and among young-onset cases.
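
For readers unfamiliar with the two quantities reported, the sketch below computes them in their textbook forms: the multiplicative interaction parameter taken as psi = OR_11 / (OR_10 * OR_01), one standard definition, and a population attributable risk via Levin's formula. The consortium's pooled, covariate-adjusted estimation is far more involved, and the inputs below are made-up, not study data.

```python
# Textbook forms of the interaction parameter psi and the attributable risk.

def interaction_psi(or_10: float, or_01: float, or_11: float) -> float:
    """psi > 1 means a greater-than-multiplicative joint effect."""
    return or_11 / (or_10 * or_01)

def levin_par(prevalence: float, relative_risk: float) -> float:
    """Population attributable risk for one exposure (Levin's formula)."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# hypothetical inputs: OR for tobacco only, alcohol only, and both combined
print(round(interaction_psi(or_10=2.0, or_01=1.2, or_11=5.2), 2))  # psi ~ 2.17
print(round(levin_par(prevalence=0.4, relative_risk=4.0), 2))      # PAR ~ 0.55
```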

Relevance: 20.00%

Abstract:

Models are presented for the optimal location of hubs in airline networks that take congestion effects into consideration. Hubs, which are the most congested airports, are modeled as M/D/c queuing systems, that is, Poisson arrivals, deterministic service time, and c servers. A formula is derived for the probability of a number of customers in the system, which is later used to propose a probabilistic constraint. This constraint limits the probability of b airplanes in queue to be less than a value α. Due to the computational complexity of the formulation, the model is solved using a metaheuristic based on tabu search. Computational experience is presented.
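
The paper derives this probability for M/D/c queues, which have no elementary closed form. As an illustration of the same style of probabilistic constraint, the sketch below substitutes the classical M/M/c queue, whose steady-state distribution is elementary, and finds the smallest number of servers for which P(more than b aircraft waiting) < α. The M/M/c substitution and all numbers are assumptions.

```python
# Illustrative only: M/M/c stand-in for the paper's M/D/c congestion constraint.
from math import factorial

def mmc_queue_tail(lam: float, mu: float, c: int, b: int) -> float:
    """P(more than b customers waiting) in a stable M/M/c queue."""
    a = lam / mu                 # offered load
    rho = a / c                  # utilization, must be < 1
    if rho >= 1.0:
        return 1.0
    p0 = 1.0 / (sum(a**n / factorial(n) for n in range(c))
                + a**c / (factorial(c) * (1.0 - rho)))
    pc = p0 * a**c / factorial(c)          # P(exactly c in system)
    # for n >= c the distribution is geometric: p_{c+k} = pc * rho**k,
    # so P(queue > b) = sum_{k > b} pc * rho**k = pc * rho**(b+1) / (1 - rho)
    return pc * rho**(b + 1) / (1.0 - rho)

lam, mu, b, alpha = 40.0, 6.0, 3, 0.05   # arrivals/hr, services/hr per server
c = 1
while mmc_queue_tail(lam, mu, c, b) >= alpha:
    c += 1
print(f"smallest c meeting P(queue > {b}) < {alpha}: c = {c}")
```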

Relevance: 20.00%

Abstract:

Research on judgment and decision making presents a confusing picture of human abilities. For example, much research has emphasized the dysfunctional aspects of judgmental heuristics, and yet other findings suggest that these can be highly effective. A further line of research has modeled judgment as resulting from 'as if' linear models. This paper illuminates the distinctions between these approaches by providing a common analytical framework based on the central theoretical premise that understanding human performance requires specifying how characteristics of the decision rules people use interact with the demands of the tasks they face. Our work synthesizes the analytical tools of lens model research with novel methodology developed to specify the effectiveness of heuristics in different environments, and it allows direct comparisons between the different approaches. We illustrate with both theoretical analyses and simulations. We further link our results to the empirical literature by a meta-analysis of lens model studies and estimate both human and heuristic performance in the same tasks. Our results highlight the trade-off between linear models and heuristics: whereas the former are cognitively demanding, the latter are simple to use; however, they require knowledge, and thus maps, of when and which heuristic to employ.
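
For context, the classical lens model equation underlying this line of research decomposes judgmental achievement, the correlation $r_a$ between judgments and the criterion, as

$$r_a = G\,R_s\,R_e + C\,\sqrt{1 - R_s^2}\,\sqrt{1 - R_e^2},$$

where $R_e$ is the linear predictability of the environment, $R_s$ the consistency of the judge's linear policy, $G$ the correlation between the two linear components, and $C$ the correlation between their residuals; the paper's framework extends this style of analysis to heuristics.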

Relevance: 20.00%

Abstract:

In this paper we address the issue of locating hierarchical facilities in the presence of congestion. Two hierarchical models are presented, where lower-level servers attend requests first, and then some of the served customers are referred to higher-level servers. In the first model, the objective is to find the minimum number of servers and their locations that will cover a given region within a distance or time standard. The second model is cast as a Maximal Covering Location formulation. A heuristic procedure is then presented, together with computational experience. Finally, some extensions of these models that address other types of spatial configurations are offered.
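
A minimal sketch of the Maximal Covering Location idea used in the second model: choose p server sites to maximize the demand covered within a distance standard. Plain greedy selection is shown; the hierarchical referral layer and the congestion terms of the paper are omitted, and all data are illustrative.

```python
# Greedy heuristic for a basic Maximal Covering Location problem.
import numpy as np

rng = np.random.default_rng(5)
n_demand, n_sites, p, standard = 150, 20, 4, 8.0
w = rng.uniform(1, 10, n_demand)                     # demand weights
dist = rng.uniform(0, 30, (n_demand, n_sites))       # demand-to-site distances
covers = dist <= standard                            # who each site can cover

chosen = []
covered = np.zeros(n_demand, dtype=bool)
for _ in range(p):
    # pick the site adding the most newly covered demand
    gains = [(w[covers[:, s] & ~covered].sum(), s)
             for s in range(n_sites) if s not in chosen]
    gain, s_best = max(gains)
    chosen.append(s_best)
    covered |= covers[:, s_best]
    print(f"site {s_best}: +{gain:.1f} demand, total {w[covered].sum():.1f}")
```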

Relevance: 20.00%

Abstract:

The common feature of urea cycle diseases (UCD) is a defect in ammonium elimination in the liver, leading to hyperammonemia. This excess of circulating ammonium eventually reaches the central nervous system, where the main toxic effects of ammonium occur. These are reversible or irreversible, depending on the age of onset as well as the duration and level of ammonium exposure. The brain is much more susceptible to the deleterious effects of ammonium during development than in adulthood, and surviving UCD patients may develop cortical and basal ganglia hypodensities, cortical atrophy, white matter atrophy or hypomyelination, and ventricular dilatation. While the mechanisms leading to these irreversible effects of ammonium exposure on the brain long remained poorly understood, the last few years have brought new data showing, in particular, that ammonium exposure alters several amino acid pathways and neurotransmitter systems, cerebral energy metabolism, nitric oxide synthesis, axonal and dendritic growth, signal transduction pathways, and K(+) and water channels. All these effects of ammonium on the CNS may eventually lead to energy deficit, oxidative stress, and cell death. Recent work has also proposed neuroprotective strategies, such as the use of NMDA receptor antagonists, nitric oxide inhibitors, creatine, and acetyl-L-carnitine, to counteract the toxic effects of ammonium. A better understanding of the pathophysiology of ammonium toxicity to the brain in UCD will allow the development of new strategies for neuroprotection.

Relevance: 20.00%

Abstract:

The objective of this paper is to compare the performance of two predictive radiological models, logistic regression (LR) and neural network (NN), with five different resampling methods. One hundred and sixty-seven patients with proven calvarial lesions as the only known disease were enrolled. Clinical and CT data were used for the LR and NN models. Both models were developed with cross-validation, leave-one-out, and three different bootstrap algorithms. The final results of each model were compared by error rate and the area under the receiver operating characteristic curve (Az). The neural network obtained a statistically higher Az than LR with cross-validation. The remaining resampling validation methods did not reveal statistically significant differences between the LR and NN rules. The neural network classifier performs better than the one based on logistic regression. This advantage is well detected by three-fold cross-validation, but remains unnoticed when leave-one-out or bootstrap algorithms are used.
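
A minimal sketch of the core comparison, assuming synthetic data in place of the clinical/CT features: logistic regression versus a small neural network, scored by ROC AUC (Az) under three-fold cross-validation. The paper's leave-one-out and bootstrap protocols are not reproduced here.

```python
# LR vs NN under three-fold cross-validated ROC AUC (synthetic stand-in data).
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=167, n_features=10, n_informative=5,
                           random_state=0)  # stand-in for 167 patients

models = {
    "LR": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "NN": make_pipeline(StandardScaler(),
                        MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                                      random_state=0)),
}
for name, model in models.items():
    auc = cross_val_score(model, X, y, cv=3, scoring="roc_auc")
    print(f"{name}: AUC (Az) = {auc.mean():.3f} +/- {auc.std():.3f}")
```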