906 results for stochastic geometry
Abstract:
Although native to the tropical and subtropical areas of Southeast Asia, Aedes albopictus is now found on five continents, primarily due to its great capacity to adapt to different environments. This species is considered a secondary vector of dengue virus in several countries. Wing geometric morphometrics is widely used to furnish morphological markers for the characterisation and identification of species of medical importance and for the assessment of population dynamics. In this work, we investigated the metric differentiation of the wings of Ae. albopictus samples collected over a four-year period (2007-2010) in São Paulo, Brazil. Wing size significantly decreased during this period for both sexes and the wing shape also changed over time, with the wing shapes of males showing greater differences after 2008 and those of females differing more after 2009. Given that the wings play sex-specific roles, these findings suggest that the males and females could be affected by differential evolutionary pressures. Consistent with this hypothesis, a sexually dimorphic pattern was detected and quantified: the females were larger than the males (with respect to the mean) and had a distinct wing shape, regardless of allometric effects. In conclusion, wing alterations, particularly those involving shape, are a sensitive indicator of microevolutionary processes in this species.
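Since the abstract repeatedly refers to wing size, a minimal Python sketch of centroid size, the standard size measure in geometric morphometrics, may help; the landmark coordinates below are purely illustrative and are not the study's wing landmarks or Procrustes pipeline.

```python
import numpy as np

def centroid_size(landmarks):
    """Centroid size of a landmark configuration: the square root of the
    summed squared distances of the landmarks from their centroid."""
    pts = np.asarray(landmarks, dtype=float)   # shape (n_landmarks, 2)
    centered = pts - pts.mean(axis=0)
    return float(np.sqrt((centered ** 2).sum()))

# Hypothetical 2-D wing landmark coordinates (illustrative only).
wing = [(0.0, 0.0), (2.1, 0.3), (3.8, 1.0), (2.5, 1.9), (0.9, 1.4)]
print(centroid_size(wing))
```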
Abstract:
There are many factors that influence the day-ahead market bidding strategies of a generation company (GenCo) in the current energy market framework. Environmental policy issues have become increasingly important for fossil-fuelled power plants and must be considered in their management, giving rise to emission limitations. This work investigates the influence of both the allowances and emission reduction plans and of the incorporation of medium-term derivatives commitments on the optimal generation bidding strategy for the day-ahead electricity market. Two different technologies are considered: coal thermal units, a high-emission technology, and combined cycle gas turbine units, a low-emission technology. The Iberian Electricity Market and the Spanish National Emissions and Allocation Plans provide the framework for dealing with environmental issues in the day-ahead market bidding strategies. To address emission limitations, some of the standard risk management methodologies developed for financial markets, such as Value-at-Risk (VaR) and Conditional Value-at-Risk (CVaR), have been extended. This study offers electricity generation utilities a mathematical model to determine, for each of their generation units, the individual optimal generation bid to the wholesale electricity market that maximizes the long-run profits of the utility while abiding by the Iberian Electricity Market rules, the environmental restrictions set by the EU Emission Trading Scheme, and the restrictions set by the Spanish National Emissions Reduction Plan. The economic implications for a GenCo of including the environmental restrictions of these National Plans are analyzed and the most notable results are presented.
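As a rough illustration of the two risk measures named above, the following Python sketch estimates VaR and CVaR from a sample of simulated profits; the 95% confidence level and the normal profit scenarios are illustrative assumptions, not the paper's model or data.

```python
import numpy as np

def var_cvar(profits, alpha=0.95):
    """Estimate Value-at-Risk and Conditional Value-at-Risk of a profit
    sample at confidence level alpha (losses are negated profits)."""
    losses = -np.asarray(profits, dtype=float)
    var = np.quantile(losses, alpha)          # loss exceeded with prob. 1 - alpha
    cvar = losses[losses >= var].mean()       # average loss beyond the VaR threshold
    return var, cvar

# Illustrative scenarios of daily bidding profit (arbitrary units).
rng = np.random.default_rng(0)
profits = rng.normal(loc=100.0, scale=30.0, size=10_000)
print(var_cvar(profits, alpha=0.95))
```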
Abstract:
Animals can often coordinate their actions to achieve mutually beneficial outcomes. However, this can result in a social dilemma when uncertainty about the behavior of partners creates multiple fitness peaks. Strategies that minimize risk ("risk dominant") instead of maximizing reward ("payoff dominant") are favored in economic models when individuals learn behaviors that increase their payoffs. Specifically, such strategies are shown to be "stochastically stable" (a refinement of evolutionary stability). Here, we extend the notion of stochastic stability to biological models of continuous phenotypes at a mutation-selection-drift balance. This allows us to make a unique prediction for long-term evolution in games with multiple equilibria. We show how genetic relatedness due to limited dispersal and scaled to account for local competition can crucially affect the stochastically stable outcome of coordination games. We find that positive relatedness (weak local competition) increases the chance that the payoff-dominant strategy is stochastically stable, even when it is not risk dominant. Conversely, negative relatedness (strong local competition) increases the chance that strategies evolve that are neither payoff nor risk dominant. Extending our results to large multiplayer coordination games, we find that negative relatedness can create competition so extreme that the game effectively changes to a hawk-dove game and a stochastically stable polymorphism between the alternative strategies evolves. These results demonstrate the usefulness of stochastic stability in characterizing long-term evolution of continuous phenotypes: the outcomes of multiplayer games can be reduced to the generic equilibria of two-player games and the effect of spatial structure can be analyzed readily.
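For readers less familiar with the two selection criteria contrasted above, the following worked example, a generic symmetric 2x2 coordination game that is not taken from the paper, shows how an equilibrium can be payoff dominant without being risk dominant.

```latex
% Illustrative symmetric 2x2 coordination game (row player's payoffs);
% not a game analyzed in the paper.
\[
\begin{array}{c|cc}
   & A & B \\ \hline
 A & 4 & 0 \\
 B & 3 & 2
\end{array}
\]
% (A,A) is payoff dominant: 4 > 2 = payoff of (B,B).
% B is risk dominant: against an opponent playing A or B with equal
% probability, B earns (3 + 2)/2 = 2.5 while A earns (4 + 0)/2 = 2;
% equivalently, a - c = 1 < d - b = 2 in the usual notation
% a = u(A,A), b = u(A,B), c = u(B,A), d = u(B,B).
```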
Abstract:
Demosaicking is a particular case of interpolation problems where, from a scalar image in which each pixel has either the red, the green or the blue component, we want to interpolate the full-color image. State-of-the-art demosaicking algorithms perform interpolation along edges, but these edges are estimated locally. We propose a level-set-based geometric method to estimate image edges, inspired by the image in-painting literature. This method has a time complexity of O(S), where S is the number of pixels in the image, and compares favorably with the state-of-the-art algorithms both visually and in most relevant image quality measures.
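The idea of interpolating along rather than across edges can be made concrete with a minimal, generic sketch: the Python function below fills in missing green samples of an assumed RGGB Bayer mosaic along the direction of the smaller gradient. It is a standard gradient-directed baseline, not the level-set method proposed in the paper.

```python
import numpy as np

def interpolate_green(bayer):
    """Edge-directed interpolation of the green channel on an RGGB Bayer
    mosaic: at each red/blue site, interpolate green along the direction
    (horizontal or vertical) with the smaller gradient, i.e. along the
    likely edge rather than across it."""
    h, w = bayer.shape
    green = bayer.astype(float).copy()
    pad = np.pad(green, 1, mode="reflect")
    for y in range(h):
        for x in range(w):
            # In an RGGB mosaic, green samples sit where (y + x) is odd.
            if (y + x) % 2 == 1:
                continue
            up, down = pad[y, x + 1], pad[y + 2, x + 1]
            left, right = pad[y + 1, x], pad[y + 1, x + 2]
            if abs(left - right) < abs(up - down):   # weaker horizontal gradient
                green[y, x] = (left + right) / 2.0   # interpolate along the edge
            else:
                green[y, x] = (up + down) / 2.0
    return green
```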
Abstract:
A complete life cycle model for northern corn rootworm, Diabrotica barberi Smith and Lawrence, is developed using a published single-season model of adult population dynamics and data from field experiments. Temperature-dependent development and age-dependent advancement determine adult population dynamics and oviposition, while a simple stochastic hatch and density-dependent larval survival model determine adult emergence. Dispersal is not modeled. To evaluate the long-run performance of the model, stochastically generated daily air and soil temperatures are used for 100-year simulations for a variety of corn planting and flowering dates in Ithaca, NY, and Brookings, SD. Once the model is corrected for a bias in oviposition, model predictions for both locations are consistent with anecdotal field data. Extinctions still occur, but these may be consistent with northern corn rootworm metapopulation dynamics.
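The two mechanisms named above, temperature-dependent development and density-dependent larval survival, can be sketched schematically in Python; the threshold, degree-day and survival parameters below are hypothetical placeholders, not the calibrated values of the published model.

```python
import numpy as np

# Hypothetical parameters for illustration only.
BASE_TEMP_C = 11.0      # lower development threshold (assumed)
DD_TO_EMERGE = 350.0    # degree-days needed for larval development (assumed)

def degree_days(daily_mean_temps, base=BASE_TEMP_C):
    """Accumulate degree-days above a base temperature."""
    t = np.asarray(daily_mean_temps, dtype=float)
    return np.cumsum(np.maximum(t - base, 0.0))

def density_dependent_survival(eggs_hatched, k=0.002):
    """Simple Ricker-style density-dependent larval survival."""
    return eggs_hatched * np.exp(-k * eggs_hatched)

# Example: first day the degree-day total reaches the emergence threshold
# under a stochastically generated daily temperature series.
rng = np.random.default_rng(1)
temps = rng.normal(18.0, 4.0, size=200)
emergence_day = int(np.argmax(degree_days(temps) >= DD_TO_EMERGE))
survivors = density_dependent_survival(eggs_hatched=5000)
print(emergence_day, round(survivors, 1))
```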
Abstract:
Cultural variation in a population is affected by the rate of occurrence of cultural innovations, whether such innovations are preferred or eschewed, how they are transmitted between individuals in the population, and the size of the population. An innovation, such as a modification in an attribute of a handaxe, may be lost or may become a property of all handaxes, which we call "fixation of the innovation." Alternatively, several innovations may attain appreciable frequencies, in which case properties of the frequency distribution (for example, of handaxe measurements) are important. Here we apply the Moran model from the stochastic theory of population genetics to study the evolution of cultural innovations. We obtain the probability that an initially rare innovation becomes fixed, and the expected time this takes. When variation in cultural traits is due to recurrent innovation, copy error, and sampling from generation to generation, we describe properties of this variation, such as the level of heterogeneity expected in the population. For all of these, we determine the effect of the mode of social transmission: conformist, where there is a tendency for each naïve newborn to copy the most popular variant; pro-novelty bias, where the newborn prefers a specific variant if it exists among those it samples; one-to-many transmission, where the variant one individual carries is copied by all newborns while that individual remains alive. We compare our findings with those predicted by prevailing theories for rates of cultural change and the distribution of cultural variation.
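For orientation, the following textbook results for the haploid Moran model give the flavor of the fixation quantities studied here; the paper derives the analogues for cultural innovations under the transmission modes listed above.

```latex
% Textbook fixation probabilities in the haploid Moran model of size N
% (background only; not the paper's derivation for cultural traits).
\[
  \Pr(\text{fixation} \mid i \text{ copies, neutral}) = \frac{i}{N},
  \qquad
  \Pr(\text{fixation} \mid 1 \text{ copy, relative fitness } r)
  = \frac{1 - 1/r}{\,1 - 1/r^{N}\,}.
\]
```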
Abstract:
The Aitchison vector space structure for the simplex is generalized to a Hilbert space structure A2(P) for distributions and likelihoods on arbitrary spaces. Central notions of statistics, such as information or likelihood, can be identified in the algebraic structure of A2(P) and have corresponding notions in compositional data analysis, such as the Aitchison distance or the centered log-ratio transform. In this way, quite elaborate aspects of mathematical statistics can be understood easily in the light of a simple vector space structure and of compositional data analysis. For example, the combination of statistical information, such as Bayesian updating, and the combination of likelihood and robust M-estimation functions are simple additions/perturbations in A2(Pprior). Weighting observations corresponds to a weighted addition of the corresponding evidence. Likelihood-based statistics for general exponential families turn out to have a particularly easy interpretation in terms of A2(P). Regular exponential families form finite-dimensional linear subspaces of A2(P), and they correspond to finite-dimensional subspaces formed by their posteriors in the dual information space A2(Pprior). The Aitchison norm can be identified with mean Fisher information. The closing constant itself is identified with a generalization of the cumulant function and shown to be the Kullback-Leibler directed information. Fisher information is the local geometry of the manifold induced by the A2(P) derivative of the Kullback-Leibler information, and the space A2(P) can therefore be seen as the tangential geometry of statistical inference at the distribution P. The discussion of A2(P)-valued random variables, such as estimation functions or likelihoods, gives a further interpretation of Fisher information as the expected squared norm of evidence and a scale-free understanding of unbiased reasoning.
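The finite-dimensional case, the ordinary Aitchison geometry of the simplex, gives a concrete feel for these operations. The Python sketch below implements the centered log-ratio transform, the perturbation operation, and the Aitchison distance for three-part compositions; the prior/likelihood vectors are illustrative only, and the paper's generalization to arbitrary spaces is not reproduced here.

```python
import numpy as np

def clr(x):
    """Centered log-ratio transform of a composition with positive parts."""
    x = np.asarray(x, dtype=float)
    g = np.exp(np.mean(np.log(x)))        # geometric mean of the parts
    return np.log(x / g)

def perturb(x, y):
    """Aitchison perturbation, the 'addition' of the simplex geometry."""
    z = np.asarray(x, dtype=float) * np.asarray(y, dtype=float)
    return z / z.sum()

def aitchison_distance(x, y):
    """Distance induced by the Aitchison inner product (clr norm)."""
    return float(np.linalg.norm(clr(x) - clr(y)))

# Perturbation mirrors Bayesian updating: prior times likelihood, renormalized.
prior = np.array([0.5, 0.3, 0.2])
likelihood = np.array([0.2, 0.5, 0.3])
posterior = perturb(prior, likelihood)
print(posterior, aitchison_distance(prior, posterior))
```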
Abstract:
In this paper we propose the infimum of the Arrow-Pratt index of absolute risk aversion as a measure of the global risk aversion of a utility function. We then show that, for any given arbitrary pair of distributions, there exists a threshold level of global risk aversion such that all increasing concave utility functions with at least as much global risk aversion would rank the two distributions in the same way. Furthermore, this threshold level is sharp in the sense that, for any lower level of global risk aversion, we can find two utility functions in this class yielding opposite preference relations for the two distributions.
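For orientation, the standard definition of the Arrow-Pratt index and the proposed global measure, its infimum, can be written out and illustrated with a CARA utility; the example is a textbook one, not taken from the paper.

```latex
% Arrow-Pratt index of absolute risk aversion, the global measure (its
% infimum), and a CARA example for which the two coincide.
\[
  A_u(x) = -\frac{u''(x)}{u'(x)}, \qquad
  \underline{A}_u = \inf_x A_u(x), \qquad
  u(x) = -e^{-\alpha x} \;\Rightarrow\; A_u(x) = \alpha
  \;\Rightarrow\; \underline{A}_u = \alpha .
\]
```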
Abstract:
The achievable region approach seeks solutions to stochastic optimisation problems by: (i) characterising the space of all possible performances (the achievable region) of the system of interest, and (ii) optimising the overall system-wide performance objective over this space. This is radically different from conventional formulations based on dynamic programming. The approach is explained with reference to a simple two-class queueing system. Powerful new methodologies due to the authors and co-workers are deployed to analyse a general multiclass queueing system with parallel servers and then to develop an approach to optimal load distribution across a network of interconnected stations. Finally, the approach is used for the first time to analyse a class of intensity control problems.
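A textbook instance of steps (i) and (ii) is the two-class, work-conserving M/G/1 queue: Kleinrock's conservation law confines the mean waiting times to a line segment (the achievable region), and a linear holding-cost objective is minimized at an endpoint of that segment, recovering the classical c-mu priority rule. The display below states this schematically; it is background, not the paper's general multiclass development.

```latex
% Two-class, non-preemptive, work-conserving M/G/1 queue: the conservation
% law fixes a weighted sum of mean waits, so the achievable pairs lie on a
% segment whose endpoints correspond to the two strict priority rules.
\[
  \rho_1\,\mathbb{E}[W_1] + \rho_2\,\mathbb{E}[W_2] = \text{const.}
  \quad\Longrightarrow\quad
  \min_{(\mathbb{E}[W_1],\,\mathbb{E}[W_2])\,\in\,\text{segment}}
  \; c_1\lambda_1\,\mathbb{E}[W_1] + c_2\lambda_2\,\mathbb{E}[W_2]
  \ \text{is attained at a vertex (a strict priority rule).}
\]
```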
Abstract:
Most research on single machine scheduling has assumed the linearity of job holding costs, which is arguably not appropriate in some applications. This motivates our study of a model for scheduling $n$ classes of stochastic jobs on a single machine, with the objective of minimizing the total expected holding cost (discounted or undiscounted). We allow general holding cost rates that are separable, nondecreasing and convex in the number of jobs in each class. We formulate the problem as a linear program over a certain greedoid polytope, and establish that it is solved optimally by a dynamic (priority) index rule, which extends the classical Smith's rule (1956) for the linear case. Unlike Smith's indices, defined for each class, our new indices are defined for each extended class, consisting of a class and a number of jobs in that class, and yield an optimal dynamic index rule: work at each time on a job whose current extended class has larger index. We further show that the indices possess a decomposition property, as they are computed separately for each class, and interpret them in economic terms as marginal expected cost rate reductions per unit of expected processing time. We establish the results by deploying a methodology recently introduced by us [J. Niño-Mora (1999). "Restless bandits, partial conservation laws, and indexability." Forthcoming in Advances in Applied Probability, Vol. 33, No. 1, 2001], based on the satisfaction by performance measures of partial conservation laws (PCL), which extend the generalized conservation laws of Bertsimas and Niño-Mora (1996): PCL provide a polyhedral framework for establishing the optimality of index policies with special structure in scheduling problems under admissible objectives, which we apply to the model of concern.
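For the linear special case mentioned above, classical Smith's rule reduces to sequencing jobs by their weight-to-processing-time ratio. The short Python sketch below implements that deterministic baseline; the paper's extended, per-extended-class indices for convex holding costs are not reproduced here.

```python
from typing import List, Tuple

def smith_order(jobs: List[Tuple[float, float]]) -> List[int]:
    """Classical Smith's rule (weighted shortest processing time) for the
    deterministic, linear-cost case: sequence jobs in nonincreasing order
    of holding-cost weight divided by processing time."""
    # jobs[i] = (processing_time, holding_cost_weight)
    return sorted(range(len(jobs)), key=lambda i: -jobs[i][1] / jobs[i][0])

# Three jobs: (processing time, linear holding cost rate).
jobs = [(4.0, 2.0), (1.0, 1.0), (3.0, 6.0)]
print(smith_order(jobs))   # highest weight-to-time ratio first -> [2, 1, 0]
```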
Abstract:
Using a suitable Hull and White type formula, we develop a methodology to obtain a second order approximation to the implied volatility for very short maturities. Using this approximation, we accurately calibrate the full set of parameters of the Heston model. One of the reasons that makes our calibration for short maturities so accurate is that we also take into account the term structure for large maturities. We may say that calibration is not "memoryless", in the sense that the option's behavior far away from maturity does influence calibration when the option gets close to expiration. Our results provide a way to perform a quick calibration of a closed-form approximation to vanilla options that can then be used to price exotic derivatives. The methodology is simple, accurate, fast, and it requires a minimal computational cost.
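For background, the classical Hull and White mixing formula for the zero-correlation case is stated below; the paper works with a suitable generalization of this type of formula rather than this exact expression.

```latex
% Classical Hull-White mixing formula (volatility independent of the
% Brownian motion driving the asset): the call price is the Black-Scholes
% price averaged over the realized mean variance.
\[
  C_0 = \mathbb{E}\!\left[ C_{\mathrm{BS}}\!\left(S_0, K, T, \bar{\sigma}_T\right) \right],
  \qquad
  \bar{\sigma}_T^{2} = \frac{1}{T}\int_0^T \sigma_t^{2}\, dt .
\]
```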