1000 results for vertex models
Abstract:
This chapter highlights the problems that structural methods and SVAR approaches face when estimating DSGE models and examining their ability to capture important features of the data. We show that structural methods are subject to severe identification problems due, in large part, to the nature of DSGE models. These problems can be patched up in a number of ways but solved only if DSGEs are completely reparametrized or respecified. The potential misspecification of the structural relationships gives Bayesian methods an edge over classical ones in structural estimation. SVAR approaches may face invertibility problems, but simple diagnostics can help to detect and remedy them. A pragmatic empirical approach ought to use the flexibility of SVARs against potential misspecification of the structural relationships but must firmly tie SVARs to the class of DSGE models which could have generated the data.
Abstract:
Since ethical concerns are calling for more attention within Operational Research, we present three approaches to combining Operational Research models with ethics. Our intention is to clarify the trade-offs faced by the OR community, in particular the tension between the scientific legitimacy of OR models (ethics outside OR models) and the integration of ethics within models (ethics within OR models). Presenting and discussing an approach that combines OR models with the process of OR (ethics beyond OR models), we suggest rigorous ways to express the relation between ethics and OR models. As our work is exploratory, we try to avoid a dogmatic attitude and call for further research. We argue that there are interesting avenues for research at the theoretical, methodological and applied levels, and that the OR community can contribute to an innovative, constructive and responsible social dialogue about its ethics.
Abstract:
This paper describes a methodology to estimate the coefficients, to test specification hypotheses and to conduct policy exercises in multi-country VAR models with cross-unit interdependencies, unit-specific dynamics and time variations in the coefficients. The framework of analysis is Bayesian: a prior flexibly reduces the dimensionality of the model and puts structure on the time variations; MCMC methods are used to obtain posterior distributions; and marginal likelihoods are used to check the fit of various specifications. Impulse responses and conditional forecasts are obtained with the output of the MCMC routine. The transmission of certain shocks across countries is analyzed.
Abstract:
We analyze the role of commitment in pre-play communication for ensuring efficient evolutionarily stable outcomes in coordination games. All players are a priori identical, as they are drawn from the same population. In games where efficient outcomes can be reached by players coordinating on the same action, we find commitment to be necessary to enforce efficiency. In games where efficient outcomes only result from play of different actions, communication without commitment is most effective, although efficiency can no longer be guaranteed. Only when there are many messages do inefficient outcomes become negligible, as their basins of attraction become very small.
Abstract:
When dealing with the design of service networks, such as health and EMS services, banking or distributed ticket-selling services, the location of service centers has a strong influence on the congestion at each of them and, consequently, on the quality of service. In this paper, several models are presented to consider service congestion. The first model addresses the location of the least number of single-server centers such that all of the population is served within a standard distance, and nobody stands in line longer than a given time limit, or with more than a predetermined number of other clients. We then formulate several maximal coverage models, with one or more servers per service center. A new heuristic is developed to solve the models and tested on a 30-node network.
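The flavor of the first model can be sketched with a greedy heuristic in which a crude per-center client cap stands in for the waiting-line constraint. The distance matrix, coverage standard and cap below are all invented, and the paper's actual formulation is an optimization model, not this heuristic:

```python
# Illustrative greedy sketch: open the fewest single-server centers so
# that every client is within a standard distance, with a per-center
# client cap standing in for the waiting-line constraint. All data here
# is invented.

def greedy_capacitated_cover(dist, standard, cap):
    """Greedy: repeatedly open the candidate site that covers the most
    still-unassigned clients within `standard`, assigning at most `cap`
    clients to each opened center."""
    unassigned = set(range(len(dist)))
    open_sites, assignment = [], {}
    while unassigned:
        best_site, best_cover = None, []
        for j in range(len(dist[0])):
            if j in open_sites:
                continue
            cover = [i for i in unassigned if dist[i][j] <= standard][:cap]
            if len(cover) > len(best_cover):
                best_site, best_cover = j, cover
        if best_site is None:        # leftover clients cannot be covered
            break
        open_sites.append(best_site)
        for i in best_cover:
            assignment[i] = best_site
            unassigned.discard(i)
    return open_sites, assignment

dist = [[1, 5, 9],   # dist[client][site], invented toy instance
        [2, 4, 9],
        [9, 1, 3],
        [9, 2, 2]]
opened, assign = greedy_capacitated_cover(dist, standard=3, cap=2)
```

On this toy instance the heuristic opens two centers and splits the four clients evenly between them; the paper's models would instead impose the congestion limit through explicit queueing constraints.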
Abstract:
This paper illustrates the philosophy which forms the basis of calibration exercises in general equilibrium macroeconomic models and the details of the procedure, the advantages and the disadvantages of the approach, with particular reference to the issue of testing "false" economic models. We provide an overview of the most recent simulation-based approaches to the testing problem and compare them to standard econometric methods used to test the fit of non-linear dynamic general equilibrium models. We illustrate how simulation-based techniques can be used to formally evaluate the fit of a calibrated model to the data and to obtain ideas on how to improve the model design, using a standard problem in the international real business cycle literature, i.e., whether a model with complete financial markets and no restrictions on capital mobility is able to reproduce the second-order properties of aggregate saving and aggregate investment in an open economy.
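A stylized sketch of the simulation-based evaluation idea (not the paper's actual procedure): simulate a toy "model" many times, collect a second-order statistic, and locate a hypothetical data value within the simulated distribution. The model, its parameters and the observed value are all invented:

```python
import random

# Toy two-series "model" driven by a common shock; in each replication
# we record the cross-correlation of the two series, then ask where a
# hypothetical "data" correlation falls in the simulated distribution.
# rho, the noise scale and the observed value are all invented.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def simulated_corrs(rho=0.9, n=100, reps=500, seed=0):
    rng = random.Random(seed)
    out = []
    for _ in range(reps):
        common = [rng.gauss(0, 1) for _ in range(n)]
        saving = [rho * c + rng.gauss(0, 0.5) for c in common]
        invest = [rho * c + rng.gauss(0, 0.5) for c in common]
        out.append(pearson(saving, invest))
    return out

corrs = simulated_corrs()
observed = 0.6  # invented stand-in for the data correlation
frac_below = sum(c < observed for c in corrs) / len(corrs)
```

If `frac_below` is close to 0 or 1, the model's simulated distribution rarely produces the observed statistic, which is the informal sense in which such exercises reject a calibrated model.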
Abstract:
The past four decades have witnessed explosive growth in the field of network-based facility location modeling. This is not at all surprising, since location policy is one of the most profitable areas of applied systems analysis in regional science, and ample theoretical and applied challenges are offered. Location-allocation models seek the location of facilities and/or services (e.g., schools, hospitals, and warehouses) so as to optimize one or several objectives, generally related to the efficiency of the system or to the allocation of resources. This paper concerns the location of facilities or services in discrete space or networks that are related to the public sector, such as emergency services (ambulances, fire stations, and police units), school systems and postal facilities. The paper is structured as follows: first, we focus on public facility location models that use some type of coverage criterion, with special emphasis on emergency services. The second section examines models based on the P-Median problem and some of the issues faced by planners when implementing this formulation in real-world locational decisions. Finally, the last section examines new trends in public sector facility location modeling.
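As a small illustration of the P-Median problem discussed above, a tiny instance can be solved exactly by enumeration. This only works for toy instances (real ones require heuristics or integer programming); the points below are invented and sit on a line so the optimum is easy to verify by eye:

```python
from itertools import combinations

# P-Median: choose p facility sites minimizing total client-to-nearest-
# facility distance. Exact enumeration, feasible only for toy instances.

def p_median_exhaustive(dist, p):
    """dist[i][j] = distance from client i to candidate site j."""
    n_sites = len(dist[0])
    best_cost, best_set = float("inf"), None
    for S in combinations(range(n_sites), p):
        cost = sum(min(row[j] for j in S) for row in dist)
        if cost < best_cost:
            best_cost, best_set = cost, S
    return best_set, best_cost

pos = [0, 1, 2, 10, 11, 12]               # clients and sites coincide
dist = [[abs(a - b) for b in pos] for a in pos]
sites, cost = p_median_exhaustive(dist, p=2)
```

With two well-separated clusters of three points each, the optimum places one facility at the middle of each cluster.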
Abstract:
The detection of Parkinson's disease (PD) in its preclinical stages, prior to outright neurodegeneration, is essential to the development of neuroprotective therapies and could reduce the number of misdiagnosed patients. However, early diagnosis is currently hampered by the lack of reliable biomarkers. (1)H magnetic resonance spectroscopy (MRS) offers a noninvasive measure of brain metabolite levels that allows the identification of such potential biomarkers. This study aimed at using MRS on an ultrahigh-field 14.1 T magnet to explore the striatal metabolic changes occurring in two different rat models of the disease. Rats lesioned by the injection of 6-hydroxydopamine (6-OHDA) into the medial forebrain bundle were used to model a complete nigrostriatal lesion, while a genetic model based on the nigral injection of an adeno-associated viral (AAV) vector coding for human α-synuclein was used to model progressive neurodegeneration and dopaminergic neuron dysfunction, thereby replicating conditions closer to the early pathological stages of PD. MRS measurements in the striatum of the 6-OHDA rats revealed significant decreases in glutamate and N-acetyl-aspartate levels and a significant increase in the GABA level in the ipsilateral hemisphere compared with the contralateral one, while the αSyn-overexpressing rats showed a significant increase in the striatal GABA level only. We therefore conclude that MRS measurements of striatal GABA levels could allow the detection of early nigrostriatal defects prior to outright neurodegeneration and, as such, offer great potential as a sensitive biomarker of presymptomatic PD.
Abstract:
Cannabinoid receptor 1 (CB(1) receptor) controls several neuronal functions, including neurotransmitter release, synaptic plasticity, gene expression and neuronal viability. Downregulation of CB(1) expression in the basal ganglia of patients with Huntington's disease (HD) and animal models represents one of the earliest molecular events induced by mutant huntingtin (mHtt). This early disruption of neuronal CB(1) signaling is thought to contribute to HD symptoms and neurodegeneration. Here we determined whether CB(1) downregulation measured in patients with HD and mouse models was ubiquitous or restricted to specific striatal neuronal subpopulations. Using unbiased semi-quantitative immunohistochemistry, we confirmed previous studies showing that CB(1) expression is downregulated in medium spiny neurons of the indirect pathway, and found that CB(1) is also downregulated in neuropeptide Y (NPY)/neuronal nitric oxide synthase (nNOS)-expressing interneurons while remaining unchanged in parvalbumin- and calretinin-expressing interneurons. CB(1) downregulation in striatal NPY/nNOS-expressing interneurons occurs in R6/2 mice, Hdh(Q150/Q150) mice and the caudate nucleus of patients with HD. In R6/2 mice, CB(1) downregulation in NPY/nNOS-expressing interneurons correlates with diffuse expression of mHtt in the soma. This downregulation also occludes the ability of cannabinoid agonists to activate the pro-survival signaling molecule cAMP response element-binding protein in NPY/nNOS-expressing interneurons. Loss of CB(1) signaling in NPY/nNOS-expressing interneurons could contribute to the impairment of basal ganglia functions linked to HD.
Abstract:
This paper presents a test of the predictive validity of various classes of QALY models (i.e., linear, power and exponential models). We first estimated TTO utilities for 43 EQ-5D chronic health states, and these states were then embedded in health profiles. The chronic TTO utilities were then used to predict the responses to TTO questions with health profiles. We find that the power QALY model clearly outperforms the linear and exponential QALY models. The optimal power coefficient is 0.65. Our results suggest that TTO-based QALY calculations may be biased. This bias can be avoided by using a power QALY model.
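A minimal sketch of linear versus power QALY weighting, assuming the power is applied to each spell's duration (one common reading of the model; the paper's exact profile formula may differ). The utilities and durations in the example are invented:

```python
# Linear vs. power QALY weighting of a health profile given as
# (duration, utility) spells. The 0.65 exponent is the optimum reported
# in the abstract; the example profile is invented.

def qaly_linear(profile):
    """Linear QALY model: sum of duration * utility over spells."""
    return sum(t * u for t, u in profile)

def qaly_power(profile, r=0.65):
    """Power QALY model: duration enters as t**r, which damps the
    weight of long spells relative to the linear model."""
    return sum(t**r * u for t, u in profile)

profile = [(10, 0.8), (5, 0.5)]   # (years, utility), invented
lin, pw = qaly_linear(profile), qaly_power(profile)
```

Because t**0.65 grows more slowly than t, the power model always assigns long spells a smaller weight than the linear model does, which is the mechanism behind the bias correction the abstract describes.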
Abstract:
Eurymetopum is an Andean clerid genus with 22 species. We modeled the ecological niches of 19 species with Maxent and used them as potential distribution maps to identify patterns of richness and endemicity. All modeled species maps were overlaid in a single map in order to determine richness. We performed an optimality analysis with NDM/VNDM on a grid of 1° latitude-longitude in order to identify endemism. We found a highly rich area, located between 32° and 41° south latitude, where the richest pixels hold 16 species. One area of endemism was identified, located in the Maule and Valdivian Forest biogeographic provinces, which also extends to the Santiago province of the Central Chilean subregion, and contains four endemic species (E. parallelum, E. prasinum, E. proteus, and E. viride), as well as 16 non-endemic species. The sympatry of these phylogenetically unrelated species might indicate ancient vicariance processes, followed by episodes of dispersal. Based on our results, we suggest a close relationship between these provinces, with the Maule representing a complex area.
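The richness-mapping step (overlaying binary presence/absence maps and counting species per cell) can be sketched as follows; the 3x3 grids are invented stand-ins for thresholded Maxent outputs:

```python
# Overlay binary presence/absence grids, one per species, and count
# species per cell. Grids are invented stand-ins for Maxent maps.

def richness(grids):
    """Cell-wise sum across species' presence/absence grids."""
    rows, cols = len(grids[0]), len(grids[0][0])
    return [[sum(g[r][c] for g in grids) for c in range(cols)]
            for r in range(rows)]

species_maps = [
    [[1, 0, 0], [1, 1, 0], [0, 0, 0]],
    [[0, 0, 0], [1, 1, 1], [0, 1, 0]],
    [[1, 1, 0], [0, 1, 0], [0, 0, 1]],
]
rich = richness(species_maps)
```

The richest cell of the combined map is the analogue of the 16-species pixels the abstract reports.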
Abstract:
This paper theoretically and empirically documents a puzzle that arises when an RBC economy with a job matching function is used to model unemployment. The standard model can generate sufficiently large cyclical fluctuations in unemployment, or a sufficiently small response of unemployment to labor market policies, but it cannot do both. Variable search and separation, finite UI benefit duration, efficiency wages, and capital all fail to resolve this puzzle. However, either sticky wages or match-specific productivity shocks can improve the model's performance by making the firm's flow of surplus more procyclical, which makes hiring more procyclical too.
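For background, the matching-function block at the heart of such models is typically Cobb-Douglas; a minimal sketch with illustrative parameter values (not the paper's calibration):

```python
# Cobb-Douglas matching function m = A * u**alpha * v**(1-alpha), with
# the implied job-finding rate f = m/u and vacancy-filling rate q = m/v,
# both functions of market tightness theta = v/u. A and alpha are
# illustrative values, not taken from the paper.

def matching_rates(u, v, A=0.6, alpha=0.5):
    """Return (job-finding rate, vacancy-filling rate) given
    unemployment u and vacancies v."""
    m = A * u**alpha * v**(1 - alpha)
    return m / u, m / v

f, q = matching_rates(0.1, 0.1)   # theta = 1, so f = q = A
```

The procyclicality of hiring in the abstract works through this block: anything that raises vacancies v raises tightness and hence the job-finding rate, while lowering the vacancy-filling rate.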
Abstract:
In this paper I explore the issue of nonlinearity (both in the data generation process and in the functional form that establishes the relationship between the parameters and the data) as it bears on the poor performance of the Generalized Method of Moments (GMM) in small samples. To this purpose I build a sequence of models, starting with a simple linear model and enlarging it progressively until I approximate a standard (nonlinear) neoclassical growth model. I then use simulation techniques to find the small-sample distribution of the GMM estimators in each of the models.
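A toy Monte Carlo in the spirit of this exercise, using the simplest just-identified linear case rather than the paper's growth model (sample size, replication count and the true parameter are all invented):

```python
import random
import statistics

# Trace out the small-sample distribution of a just-identified
# method-of-moments estimator in the linear case y = beta*x + e,
# by re-estimating beta across many simulated samples.

def mm_estimate(xs, ys):
    """Solve the moment condition E[x * (y - beta * x)] = 0 for beta."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

def small_sample_dist(beta=2.0, n=20, reps=2000, seed=0):
    """Collect the MM estimate of beta across `reps` samples of size n."""
    rng = random.Random(seed)
    draws = []
    for _ in range(reps):
        xs = [rng.gauss(0, 1) for _ in range(n)]
        ys = [beta * x + rng.gauss(0, 1) for x in xs]
        draws.append(mm_estimate(xs, ys))
    return draws

draws = small_sample_dist()
```

In this linear, just-identified case the estimator is well behaved even at n = 20; the paper's point is that the distribution deteriorates as nonlinearity is layered in.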
Abstract:
The paper proposes a numerical solution method for general equilibrium models with a continuum of heterogeneous agents, which combines elements of projection and perturbation methods. The basic idea is to first solve for the stationary solution of the model, without aggregate shocks but with fully specified idiosyncratic shocks. Afterwards, one computes a first-order perturbation of the solution in the aggregate shocks. This approach makes it possible to include a high-dimensional representation of the cross-sectional distribution in the state vector. The method is applied to a model of household saving with uninsurable income risk and liquidity constraints. The model includes not only productivity shocks, but also shocks to redistributive taxation, which cause substantial short-run variation in the cross-sectional distribution of wealth. If those shocks are operative, it is shown that a solution method based on very few statistics of the distribution is not suitable, while the proposed method can solve the model with high accuracy, at least for the case of small aggregate shocks. Techniques are discussed to reduce the dimension of the state space such that higher-order perturbations are feasible. Matlab programs to solve the model can be downloaded.
Abstract:
A new direction of research in Competitive Location theory incorporates theories of Consumer Choice Behavior in its models. Following this direction, this paper studies the importance of consumer behavior with respect to distance or transportation costs for the optimality of locations obtained by traditional Competitive Location models. To do this, it considers different ways of defining a key parameter in the basic Maximum Capture model (MAXCAP). This parameter reflects various ways of taking distance into account, based on several Consumer Choice Behavior theories. The optimal locations, and the deviation in demand captured when the optimal locations of the other models are used instead of the true ones, are computed for each model. A metaheuristic based on GRASP and Tabu search procedures is presented to solve all the models. Computational experience and an application to a 55-node network are also presented.
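A hypothetical mini-sketch of the GRASP construction phase for a maximum-capture-style problem. The coverage sets, demands and restricted-candidate-list size are invented; a real MAXCAP model would derive capture from the consumer-choice rules discussed above, and the paper couples GRASP with Tabu search rather than this bare repetition:

```python
import random

# GRASP construction phase: build many randomized greedy solutions and
# keep the best. cover[j] is the (invented) set of demand nodes site j
# would capture; demand gives node weights.

def grasp_max_capture(cover, demand, p, iters=100, rcl_size=2, seed=1):
    """Pick p sites maximizing captured demand. At each construction
    step, choose at random among the rcl_size best candidates (the
    restricted candidate list); keep the best of iters runs."""
    rng = random.Random(seed)
    best_sites, best_val = None, -1
    for _ in range(iters):
        chosen, covered = [], set()
        while len(chosen) < p:
            cands = sorted(
                (j for j in cover if j not in chosen),
                key=lambda j: -sum(demand[i] for i in cover[j] - covered),
            )
            pick = rng.choice(cands[:rcl_size])
            chosen.append(pick)
            covered |= cover[pick]
        val = sum(demand[i] for i in covered)
        if val > best_val:
            best_sites, best_val = set(chosen), val
    return best_sites, best_val

cover = {0: {0, 1}, 1: {1, 2}, 2: {3}, 3: {0, 3}}
demand = {0: 5, 1: 1, 2: 4, 3: 2}
sites, captured = grasp_max_capture(cover, demand, p=2)
```

On this toy instance, pure greedy (always taking the top candidate) gets stuck at a captured demand of 10, while the randomized candidate list lets GRASP find the optimal pair, illustrating why the randomization is worth its cost.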