1000 results for Climate change -- Mathematical models
Abstract:
Although dispersal is recognized as a key issue in several fields of population biology (such as behavioral ecology, population genetics, metapopulation dynamics or evolutionary modeling), these disciplines focus on different aspects of the concept and often make different implicit assumptions regarding migration models. Using simulations, we investigate how such assumptions translate into effective gene flow and fixation probability of selected alleles. Assumptions regarding migration type (e.g. source-sink, resident pre-emption, or balanced dispersal) and patterns (e.g. stepping-stone versus island dispersal) have large impacts when demes differ in size or selective pressure. The effects of fragmentation, as well as the spatial localization of newly arising mutations, also strongly depend on migration type and patterns. Migration rate also matters: depending on the migration type, fixation probabilities at an intermediate migration rate may lie outside the range defined by the low- and high-migration limits when demes differ in size. Given the extreme sensitivity of fixation probability to characteristics of dispersal, we underline the importance of making explicit (and documenting empirically) the crucial ecological/behavioral assumptions underlying migration models.
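A minimal forward-simulation sketch of the kind of quantity studied here: the fixation probability of a new mutant in two demes coupled by symmetric, island-style migration. The deme sizes, selection coefficients, migration rate and Wright-Fisher dynamics are illustrative assumptions, not the authors' simulation design.

import numpy as np

def fixation_probability(deme_sizes=(100, 1000), s=(0.02, 0.0),
                         m=0.01, replicates=1000, seed=0):
    """Fraction of replicates in which a mutant arising in deme 0 fixes
    in both demes (Wright-Fisher drift, genic selection, island migration)."""
    rng = np.random.default_rng(seed)
    N = np.array(deme_sizes)
    s = np.array(s)
    fixed = 0
    for _ in range(replicates):
        p = np.array([1.0 / (2 * N[0]), 0.0])    # one mutant copy arises in deme 0
        while not (np.all(p == 0.0) or np.all(p == 1.0)):
            p = (1 - m) * p + m * p[::-1]         # symmetric two-deme migration
            p = p * (1 + s) / (1 + s * p)         # genic selection within each deme
            p = rng.binomial(2 * N, p) / (2 * N)  # binomial drift
        fixed += int(np.all(p == 1.0))
    return fixed / replicates

print(fixation_probability())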
Abstract:
Four general equilibrium search models are compared quantitatively. The baseline framework is a calibrated macroeconomic model of the US economy designed for a welfare analysis of unemployment insurance policy. The other models make three simple and natural specification changes, regarding tax incidence, monopsony power in wage determination, and the relevant threat point. These specification changes have a major impact on the equilibrium and on the welfare implications of unemployment insurance, partly because search externalities magnify the effects of wage changes. The optimal level of unemployment insurance depends strongly on whether raising benefits has a larger impact on search effort or on hiring expenditure.
Abstract:
We propose a method to estimate time-invariant cyclical DSGE models using the information provided by a variety of filters. We treat data filtered with alternative procedures as contaminated proxies of the relevant model-based quantities and estimate structural and non-structural parameters jointly using a signal extraction approach. We employ simulated data to illustrate the properties of the procedure and compare our conclusions with those obtained when just one filter is used. We revisit the role of money in the transmission of monetary business cycles.
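As an illustration of the general idea (not the authors' estimator), the sketch below builds two contaminated proxies of the same latent cycle with different filters and extracts the common signal with a one-factor state-space model estimated by maximum likelihood. The simulated series, the choice of HP and Christiano-Fitzgerald filters, and the dynamic-factor specification are assumptions made for the example.

import numpy as np
import pandas as pd
from statsmodels.tsa.filters.hp_filter import hpfilter
from statsmodels.tsa.filters.cf_filter import cffilter
from statsmodels.tsa.statespace.dynamic_factor import DynamicFactor

rng = np.random.default_rng(0)
T = 300
cycle = np.zeros(T)
for t in range(1, T):
    cycle[t] = 0.9 * cycle[t - 1] + rng.normal(scale=0.01)
y = pd.Series(np.linspace(0.0, 1.0, T) + cycle, name="y")  # trend + cycle (illustrative)

# Two alternative detrending procedures -> two contaminated proxies of one latent cycle.
hp_cycle, _ = hpfilter(y, lamb=1600)
cf_cycle, _ = cffilter(y, low=6, high=32, drift=True)
proxies = pd.DataFrame({"hp": np.asarray(hp_cycle), "cf": np.asarray(cf_cycle)})

# Signal extraction: one common latent factor (the "model-based" cycle) observed
# with filter-specific measurement error; parameters estimated jointly by ML.
mod = DynamicFactor(proxies, k_factors=1, factor_order=1)
res = mod.fit(disp=False)
extracted_cycle = res.factors.filtered[0]   # estimate of the common cycle
print(extracted_cycle[:10])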
Abstract:
In this paper we use Malliavin calculus techniques to obtain an expression for the short-time behavior of the at-the-money implied volatility skew for a generalization of the Bates model, where the volatility need not be a diffusion or a Markov process, as the examples in Section 7 show. This expression depends on the derivative of the volatility in the sense of Malliavin calculus.
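For orientation only (the paper's exact expression is not reproduced here), a Bates-type model with general stochastic volatility and the object being studied can be written as

\[
\frac{dS_t}{S_t} = r\,dt + \sigma_t\,dW_t + dJ_t ,
\]

where $J$ is a compound Poisson jump component and $\sigma$ is a positive adapted process that need not be a diffusion or a Markov process. The quantity of interest is the short-time limit of the at-the-money implied volatility skew,

\[
\lim_{T\to 0}\ \left.\partial_k\, \sigma_{\mathrm{imp}}(k,T)\right|_{k=k_{\mathrm{ATM}}},
\]

whose expression, as the abstract states, involves the Malliavin derivative of the volatility process.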
Abstract:
In this paper we propose a metaheuristic to solve a new version of the Maximum Capture Problem. In the original MCP, market capture is obtained by lower traveling distances or lower traveling times; in this new version, not only the traveling time but also the waiting time affects the market share. This problem is hard to solve using standard optimization techniques. Metaheuristics are shown to offer accurate results within acceptable computing times.
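A small, self-contained sketch of one possible local-search metaheuristic for a capture problem in which both travel time and waiting time determine patronage. The random instance, the capture rule (our best travel-plus-waiting time must beat the competitor's offer) and the swap neighborhood are illustrative assumptions, not the method proposed in the paper.

import itertools
import numpy as np

rng = np.random.default_rng(1)
n_customers, n_sites, p = 50, 10, 3
demand = rng.integers(1, 10, n_customers)
travel = rng.uniform(1, 20, (n_customers, n_sites))   # travel time to candidate sites
wait = rng.uniform(0, 10, n_sites)                    # expected waiting time per site
competitor_cost = rng.uniform(5, 25, n_customers)     # competitor's best travel + waiting time

def captured_demand(open_sites):
    # a customer is captured if our best travel+waiting time beats the competitor's offer
    cost = (travel[:, open_sites] + wait[open_sites]).min(axis=1)
    return demand[cost < competitor_cost].sum()

# simple swap-based local search (one of many possible metaheuristics)
current = list(rng.choice(n_sites, p, replace=False))
best_val = captured_demand(current)
improved = True
while improved:
    improved = False
    for out_site, in_site in itertools.product(current, set(range(n_sites)) - set(current)):
        candidate = [s for s in current if s != out_site] + [in_site]
        val = captured_demand(candidate)
        if val > best_val:
            current, best_val, improved = candidate, val, True
            break

print(current, best_val)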
Abstract:
This paper discusses inference in self-exciting threshold autoregressive (SETAR) models. Of main interest is inference for the threshold parameter. It is well known that the asymptotics of the corresponding estimator depend upon whether the SETAR model is continuous or not. In the continuous case, the limiting distribution is normal and standard inference is possible. In the discontinuous case, the limiting distribution is non-normal and cannot be estimated consistently. We show that valid inference can be drawn by the use of the subsampling method. Moreover, the method can even be extended to situations where the (dis)continuity of the model is unknown. In this case, the inference for the regression parameters of the model also becomes difficult, and subsampling can be used advantageously there as well. In addition, we consider a hypothesis test for the continuity of the SETAR model. A simulation study examines small sample performance.
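The point-estimation step can be sketched in a few lines: simulate a two-regime SETAR model and recover the threshold by conditional least squares over a grid. The AR(1) regimes without intercepts and the grid limits are illustrative assumptions; the paper's contribution, subsampling-based inference for this estimator, is not reproduced here.

import numpy as np

rng = np.random.default_rng(0)

# Simulate a two-regime SETAR(2; 1, 1) model with threshold r = 0 and delay 1
T, r_true = 500, 0.0
y = np.zeros(T)
for t in range(1, T):
    if y[t - 1] <= r_true:
        y[t] = 0.7 * y[t - 1] + rng.normal()
    else:
        y[t] = -0.3 * y[t - 1] + rng.normal()

# Conditional least squares: grid search over candidate thresholds
def ssr(r):
    lag, cur = y[:-1], y[1:]
    total = 0.0
    for mask in (lag <= r, lag > r):
        X, z = lag[mask], cur[mask]
        beta = (X @ z) / (X @ X)              # regime-specific AR(1) slope
        total += ((z - beta * X) ** 2).sum()
    return total

grid = np.quantile(y, np.linspace(0.15, 0.85, 200))   # trimmed candidate thresholds
r_hat = grid[np.argmin([ssr(r) for r in grid])]
print("estimated threshold:", r_hat)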
Abstract:
We study the statistical properties of three estimation methods for a model of learning that is often fitted to experimental data: quadratic deviation measures without unobserved heterogeneity, and maximum likelihood with and without unobserved heterogeneity. After discussing identification issues, we show that the estimators are consistent and provide their asymptotic distribution. Using Monte Carlo simulations, we show that ignoring unobserved heterogeneity can lead to seriously biased estimations in samples which have the typical length of actual experiments. Better small sample properties are obtained if unobserved heterogeneity is introduced. That is, rather than estimating the parameters for each individual, the individual parameters are considered random variables, and the distribution of those random variables is estimated.
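A generic sketch of what "maximum likelihood with unobserved heterogeneity" means in this sense: instead of estimating a parameter per individual, the individual parameter is treated as a draw from a population distribution whose parameters are estimated by simulated maximum likelihood. The binary-choice setup, the normal mixing distribution and the sample sizes are illustrative assumptions, not the learning model studied in the paper.

import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(0)

# Simulated data: subject i has an individual parameter theta_i drawn from a
# population distribution; we observe T binary choices per subject.
n_subj, T = 60, 50
mu_true, sigma_true = 0.5, 1.0
theta = rng.normal(mu_true, sigma_true, n_subj)
choices = rng.random((n_subj, T)) < expit(theta)[:, None]

# Simulated ML: integrate over the unobserved individual parameter with
# Monte Carlo draws instead of estimating theta_i separately for each subject.
R = 200
base_draws = rng.standard_normal(R)

def neg_loglik(params):
    mu, log_sigma = params
    p = expit(mu + np.exp(log_sigma) * base_draws)   # choice probability per draw
    k = choices.sum(axis=1)                          # successes per subject
    pk = p[None, :] ** k[:, None]                    # observed successes
    qk = (1 - p[None, :]) ** (T - k)[:, None]        # observed failures
    lik = np.mean(pk * qk, axis=1)                   # average over draws (binomial coefficient omitted)
    return -np.log(lik).sum()

res = minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(mu_hat, sigma_hat)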
Abstract:
New location models are presented here for exploring the reduction of facilities in a region. The first of these models considers firms ceding market share to competitors under situations of financial exigency. The goal of this model is to cede the least market share, i.e., retain as much of the customer base as possible while shedding costly outlets. The second model considers a firm essentially without competition that must shrink its services for economic reasons. This firm is assumed to close outlets so that the degradation of service is limited. An example is offered within a competitive environment to demonstrate the usefulness of this modeling approach.
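One plausible reading (not the paper's exact formulation) of the first model is a facility-closure problem: close q of the currently open outlets J so as to retain the most demand, where customer i with demand w_i is retained only if the closest remaining own outlet still beats the competitor's best offer d_i^comp:

\[
\max_{S \subseteq J,\ |S| = |J| - q}\ \sum_i w_i\, \mathbf{1}\!\left[\min_{j \in S} d_{ij} \le d_i^{\mathrm{comp}}\right].
\]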
Abstract:
Models are presented for the optimal location of hubs in airline networks that take congestion effects into consideration. Hubs, which are the most congested airports, are modeled as M/D/c queuing systems, that is, Poisson arrivals, deterministic service time, and c servers. A formula is derived for the probability of a number of customers in the system, which is later used to propose a probabilistic constraint. This constraint limits the probability of having b airplanes in the queue to be less than a value α. Due to the computational complexity of the formulation, the model is solved using a meta-heuristic based on tabu search. Computational experience is presented.
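One plausible reading (not the paper's exact formula) of the probabilistic constraint: if L_k denotes the steady-state number of airplanes waiting in the queue at hub k, modeled as an M/D/c system, the constraint bounds the congestion risk at every hub,

\[
P\left(L_k \ge b\right) \le \alpha \qquad \text{for each hub } k ,
\]

so that, with probability at least 1 - α, fewer than b airplanes are queued at any hub.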