375 results for Mathematics, interdisciplinary applications
Abstract:
Stability of matchings was introduced as a new cooperative equilibrium concept in Sotomayor (Dynamics and equilibrium: essays in honor of D. Gale, 1992). That paper introduces the innovation of treating as multi-dimensional the payoff of a player with a quota greater than one. This is done for the many-to-many matching model with additively separable utilities, for which the stability concept is defined. It is then proved, via linear programming, that the set of stable outcomes is nonempty and may be strictly larger than the set of dual solutions and strictly smaller than the core. The present paper defines a general concept of stability and shows that this concept is a natural solution concept, stronger than the core concept, for a much more general coalitional game than a matching game. Instead of mutual agreements inside partnerships, the players are allowed to make collective agreements inside coalitions of any size and to distribute their labor among them. A collective agreement determines the level of labor at which the coalition operates and the division, among its members, of the income generated by the coalition. An allocation specifies a set of collective agreements for each player.
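As a point of reference, in the simplest one-to-one special case of this framework, the assignment game with additively separable utilities, stability of an outcome can be written compactly (the notation below is ours, not the paper's):

```latex
% One-to-one assignment game: a_{ij} is the joint income of pair (i,j).
% An outcome (u, v; \mu) is stable if and only if
\begin{align*}
  u_i \ge 0, \quad v_j \ge 0 && \text{(individual rationality)} \\
  u_i + v_j \ge a_{ij} \quad \forall (i,j) && \text{(no blocking pair)} \\
  u_i + v_j = a_{ij} \quad \text{whenever } \mu(i) = j && \text{(feasibility)}
\end{align*}
```

The paper's contribution is to extend this kind of condition from two-player partnerships to collective agreements inside coalitions of arbitrary size.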
Abstract:
In a decentralized setting, the game-theoretic prediction is that only strong blockings can rupture the structure of a matching. This paper argues that, under indifferences, weak blockings should also be considered when they come from the grand coalition. This solution concept requires stability plus Pareto optimality. A characterization of the set of Pareto-stable matchings for the roommate and the marriage models is provided in terms of individually rational matchings whose blocking pairs, if any, are formed with unmatched agents. These matchings always exist and provide economic intuition about how blocking can be carried out by non-trading agents, so that the transactions need not be undone as agents reach the set of stable matchings. Some properties of the Pareto-stable matchings shared by the marriage and roommate models are obtained.
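To fix ideas, here is a minimal sketch, in our own toy encoding, of the blocking-pair test underlying the characterization; it assumes strict preferences for brevity, whereas the paper's point is precisely about indifferences:

```python
# Toy marriage market: list the blocking pairs of a matching.
# Preferences are dicts mapping an agent to a ranking (lower = better);
# absence from the dict means "unacceptable". Illustration only.

def blocking_pairs(men, women, match):
    """Return all pairs (m, w) who prefer each other to their partners."""
    pairs = []
    for m, prefs_m in men.items():
        for w, prefs_w in women.items():
            rank_m, rank_w = prefs_m.get(w), prefs_w.get(m)
            if rank_m is None or rank_w is None:
                continue                    # m and w are mutually unacceptable
            pm, pw = match.get(m), match.get(w)
            m_prefers = pm is None or rank_m < prefs_m[pm]
            w_prefers = pw is None or rank_w < prefs_w[pw]
            if m_prefers and w_prefers:
                pairs.append((m, w))
    return pairs

men = {"m1": {"w1": 0, "w2": 1}, "m2": {"w2": 0}}
women = {"w1": {"m1": 0}, "w2": {"m1": 0, "m2": 1}}
match = {"m1": "w1", "w1": "m1"}            # m2 and w2 stay unmatched
print(blocking_pairs(men, women, match))    # [('m2', 'w2')]: both unmatched
```

The example matching is individually rational and its only blocking pair is formed by two unmatched agents, which is exactly the situation singled out by the characterization above.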
Abstract:
Starting from an initial price vector, prices are adjusted in order to eliminate excess demand while keeping the transfers to the sellers as low as possible. The key issue in the description of the algorithm is deciding, at each step of the auction, to which set of sellers those transfers should be made. We assume additively separable utilities and introduce a novel distinction by considering multiple sellers owning multiple identical objects and multiple buyers with an exogenously defined quota, consuming more than one object but at most one unit of a seller's good and having multi-dimensional payoffs. This distinction makes the construction of the over-demanded sets necessarily more complicated than the constructions of these sets for the other assignment games. For this approach, our mechanism yields the buyer-optimal competitive equilibrium payoff, which equals the buyer-optimal stable payoff. The symmetry of the model allows the seller-optimal stable payoff to be obtained, and the seller-optimal competitive equilibrium payoff can then also be derived.
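The abstract does not spell out the price-adjustment steps, but their flavour can be conveyed by the classical ascending auction for the one-to-one assignment game, in the style of Bertsekas and of Demange, Gale and Sotomayor; the multi-unit, multi-dimensional construction in the paper is substantially more involved than this sketch:

```python
# Ascending auction for the one-to-one assignment game (illustration only;
# the paper's buyers have quotas > 1 and need subtler over-demanded sets).
# values[i][j] = value of buyer i for object j; a small eps > 0 guarantees
# termination, with final prices within n*eps of buyer-optimal ones.

def auction(values, eps=0.01):
    n_buyers, n_objects = len(values), len(values[0])
    prices = [0.0] * n_objects
    owner = [None] * n_objects               # object -> current high bidder
    assigned = [None] * n_buyers             # buyer  -> object held
    unassigned = list(range(n_buyers))
    while unassigned:
        i = unassigned.pop()
        payoff = [values[i][j] - prices[j] for j in range(n_objects)]
        best = max(range(n_objects), key=payoff.__getitem__)
        second = max((payoff[j] for j in range(n_objects) if j != best),
                     default=0.0)
        prices[best] += payoff[best] - second + eps   # bidding increment
        if owner[best] is not None:          # previous holder is displaced
            assigned[owner[best]] = None
            unassigned.append(owner[best])
        owner[best], assigned[i] = i, best
    return assigned, prices

values = [[10, 6], [8, 7]]
print(auction(values))   # buyer 0 wins object 0, buyer 1 wins object 1
```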
Abstract:
A stable matching rule is used as the outcome function for the Admission game, where colleges behave straightforwardly and the students' strategies are given by their preferences over the colleges. We show that the college-optimal stable matching rule implements the set of stable matchings via the Nash equilibrium (NE) concept. For any other stable matching rule, the strategic behavior of the students may lead to outcomes that are not stable under the true preferences. We then introduce uncertainty about the matching selected and prove that the natural solution concept is that of NE in the strong sense. A general result shows that the random stable matching rule, as well as any stable matching rule, implements the set of stable matchings via NE in the strong sense. Precise answers are given to the strategic questions raised.
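For concreteness, the college-optimal stable matching is the one produced by college-proposing deferred acceptance (Gale and Shapley); a minimal sketch with unit college quotas, in our own toy encoding:

```python
# College-proposing deferred acceptance (quota-1 colleges for brevity).
# Preference lists run from best to worst; the output is the
# college-optimal stable matching for the reported preferences.

def college_proposing_da(college_prefs, student_prefs):
    held = {}                               # student -> college holding a place
    nxt = {c: 0 for c in college_prefs}     # next student on each college list
    free = list(college_prefs)
    while free:
        c = free.pop()
        if nxt[c] >= len(college_prefs[c]):
            continue                        # c has exhausted its list
        s = college_prefs[c][nxt[c]]
        nxt[c] += 1
        if c not in student_prefs[s]:
            free.append(c)                  # s finds c unacceptable
        elif s not in held:
            held[s] = c                     # s tentatively accepts c
        elif student_prefs[s].index(c) < student_prefs[s].index(held[s]):
            free.append(held[s])            # s trades up; old college is freed
            held[s] = c
        else:
            free.append(c)                  # s rejects c; c keeps proposing
    return held

college_prefs = {"C1": ["s1", "s2"], "C2": ["s1", "s2"]}
student_prefs = {"s1": ["C2", "C1"], "s2": ["C1", "C2"]}
print(college_proposing_da(college_prefs, student_prefs))
# {'s1': 'C2', 's2': 'C1'}
```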
Abstract:
We show that the results in Chalishajar [Controllability of mixed Volterra-Fredholm-type integro-differential systems in Banach space, J. Franklin Inst. 344(1) (2007) 12-21] and Chang and Chalishajar [Controllability of mixed Volterra-Fredholm type integro-differential systems in Banach space, J. Franklin Inst., doi:10.1016/j.jfranklin.2008.02.002] are valid only for ordinary differential control systems. As a result, the examples provided cannot be recovered as applications of the abstract results.
Abstract:
The spread of an infectious disease in a population involves interactions leading to an epidemic outbreak through a network of contacts. Extending Watts and Strogatz (1998), who showed that a few long-distance connections create a small-world effect, a model combining short- and long-distance probabilistic, regularly updated contacts helps account for spatial heterogeneity. The method is based on cellular automata. The presence of long-distance connections accelerates the small-world effect, as if the world shrank in proportion to their total number.
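A minimal sketch of this kind of experiment, with invented parameter values: an SIR update on a lattice in which each infected cell contacts its neighbours and, with small probability, one freshly drawn distant cell per step:

```python
# SIR cellular automaton: short-range neighbours plus probabilistic,
# regularly updated long-distance contacts (parameters are illustrative).
import random

N, STEPS = 50, 100
P_SHORT, P_LONG, P_REWIRE, T_RECOVER = 0.3, 0.3, 0.05, 5
S, I, R = 0, 1, 2

grid = [[S] * N for _ in range(N)]
clock = [[0] * N for _ in range(N)]
grid[N // 2][N // 2] = I                      # seed one infection

for _ in range(STEPS):
    new = [row[:] for row in grid]
    for x in range(N):
        for y in range(N):
            if grid[x][y] != I:
                continue
            # Short range: the four lattice neighbours (periodic borders).
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nx, ny = (x + dx) % N, (y + dy) % N
                if grid[nx][ny] == S and random.random() < P_SHORT:
                    new[nx][ny] = I
            # Long range: occasionally contact a random distant cell,
            # the shortcut behind the small-world acceleration.
            if random.random() < P_REWIRE:
                rx, ry = random.randrange(N), random.randrange(N)
                if grid[rx][ry] == S and random.random() < P_LONG:
                    new[rx][ry] = I
            clock[x][y] += 1
            if clock[x][y] >= T_RECOVER:
                new[x][y] = R                 # recovery after T steps
    grid = new

print(sum(row.count(R) for row in grid), "recovered cells after", STEPS, "steps")
```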
Abstract:
The immune system has been an abundant source of inspiration for contemporary computer scientists. Problem-solving strategies stemming from known immune system phenomena have been successfully applied to challenging problems of modern computing. Simulation systems and mathematical modeling are also beginning to be used to answer more complex immunological questions, such as the immune memory process and the duration of vaccine protection, whose regulatory mechanisms are still not sufficiently understood (Lundegaard, Lund, Kesmir, Brunak, Nielsen, 2007). In this article we study an in machina approach to simulating the process of antigenic mutation and its implications for the memory process. Our results suggest that the durability of immune memory is affected by the process of antigenic mutation and by the populations of soluble antibodies in the blood. The results also strongly suggest that a decrease in antibody production favors the global maintenance of immune memory.
Abstract:
The true incidence of infectious diseases is difficult to determine from surveillance or notification data. Serological surveys yield a model of the proportion of new rubella infections. The discrepancy between the model's results and official notification data from the pre-vaccination era leads one to suspect the presence of hidden infections. A simulation assuming 80% effective vaccination coverage shows a similar discrepancy between the total number of infections and the notification data.
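The abstract does not name the model, but serological age profiles of this kind are typically read through a simple catalytic model; assuming, for illustration only, a constant force of infection λ, the fraction of individuals already infected by age a is

```latex
\[
  F(a) \;=\; 1 - e^{-\lambda a},
\]
```

so λ, and hence the yearly number of new infections, can be estimated from the survey and compared against official notifications.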
Abstract:
The population dynamics of stray dogs is simulated to assess the effects of sterilization and euthanasia. In simulations spanning less than 5 years, sterilization is less efficient than euthanasia at reducing the stray dog population when similar rates are considered, although the total number of sterilized dogs per km² per year is smaller than the total number of euthanized dogs. Over 20 years, both strategies have similar efficiency. Beyond a certain rate of dog abandonment, both strategies are inefficient.
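A toy version of such a comparison, with invented parameter values (the paper's model and rates are surely richer):

```python
# Toy stray-dog dynamics: logistic births, natural deaths, abandonment,
# and either euthanasia (dogs removed) or sterilization (dogs remain but
# no longer reproduce). All parameters are illustrative, not the paper's.
R, MU, K, ABANDON, RATE, DT = 1.0, 0.2, 100.0, 2.0, 0.2, 0.01

def simulate(strategy, years):
    intact, sterile = 50.0, 0.0               # dogs per km^2
    for _ in range(int(years / DT)):
        total = intact + sterile
        births = R * intact * max(0.0, 1 - total / K)
        d_int = births + ABANDON - MU * intact - RATE * intact
        d_ster = -MU * sterile
        if strategy == "sterilization":
            d_ster += RATE * intact            # removed from breeding only
        intact += d_int * DT
        sterile += d_ster * DT
    return intact + sterile

for years in (5, 20):
    for s in ("euthanasia", "sterilization"):
        print(years, "years,", s, "->", round(simulate(s, years), 1))
```

The sketch only illustrates the bookkeeping: sterilized dogs remain in the street counts while no longer breeding, which is why short-horizon comparisons can favor euthanasia.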
Abstract:
This report is a review of Darwin's classical theory of bodily tides in which we present the analytical expressions for the orbital and rotational evolution of the bodies and for the energy dissipation rates due to their tidal interaction. General formulas are given which do not depend on any assumption linking the tidal lags to the frequencies of the corresponding tidal waves (except that equal frequency harmonics are assumed to span equal lags). Emphasis is given to the cases of companions having reached one of the two possible final states: (1) the super-synchronous stationary rotation resulting from the vanishing of the average tidal torque; (2) capture into the 1:1 spin-orbit resonance (true synchronization). In these cases, the energy dissipation is controlled by the tidal harmonic with period equal to the orbital period (instead of the semi-diurnal tide) and the singularity due to the vanishing of the geometric phase lag does not exist. It is also shown that the true synchronization with non-zero eccentricity is only possible if an extra torque exists opposite to the tidal torque. The theory is developed assuming that this additional torque is produced by an equatorial permanent asymmetry in the companion. The results are model-dependent and the theory is developed only to the second degree in eccentricity and inclination (obliquity). It can easily be extended to higher orders, but formal accuracy will not be a real improvement as long as the physics of the processes leading to tidal lags is not better known.
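For orientation, in the linear version of the theory (lags proportional to frequencies), the vanishing of the average tidal torque places the stationary rotation, to second order in the eccentricity e, at

```latex
\[
  \Omega_{\mathrm{stat}} \;\simeq\; n\,\bigl(1 + 6e^{2}\bigr),
\]
```

with n the mean motion. We quote this standard special case only as an illustration; the review itself deliberately avoids committing to a specific lag-frequency relation.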
Abstract:
The theory of diffusion in many-dimensional Hamiltonian systems is applied to asteroidal dynamics. The general formulation developed by Chirikov is applied to the Nesvorný-Morbidelli analytic model of three-body (three-orbit) mean-motion resonances (Jupiter-Saturn-asteroid). In particular, we investigate the diffusion along and across the separatrices of the (5, -2, -2) resonance of the (490) Veritas asteroidal family and its relationship to diffusion in semi-major axis and eccentricity. The estimates of diffusion were obtained using the Melnikov integral, a Hadjidemetriou-type symplectic map and numerical integrations for times up to 10^8 years.
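Though the paper's map is resonance-specific, Chirikov's diffusion estimates are easiest to see on the standard map; a minimal sketch (our parameter choices) of measuring the action diffusion coefficient numerically:

```python
# Chirikov standard map: estimate the action diffusion coefficient in the
# chaotic zone (an illustration of the formalism, not the asteroid model).
import math, random

K = 5.0                                   # stochasticity parameter, chosen
N_ORBITS, N_STEPS = 200, 1000             # large enough for global chaos

def diffusion(k):
    total = 0.0
    for _ in range(N_ORBITS):
        p = p0 = random.uniform(0, 2 * math.pi)
        x = random.uniform(0, 2 * math.pi)
        for _ in range(N_STEPS):
            p = p + k * math.sin(x)       # kick: the action is perturbed
            x = (x + p) % (2 * math.pi)   # twist: the angle advances by p
        total += (p - p0) ** 2
    return total / (N_ORBITS * N_STEPS)   # D ~ <(Delta p)^2> / t

# For large K the result should be near the quasilinear value K^2 / 2.
print(f"D(K={K}) = {diffusion(K):.2f}  (quasilinear: {K * K / 2:.2f})")
```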
Abstract:
Non-linear methods for estimating variability in time series are currently in widespread use. Among such methods are approximate entropy (ApEn) and sample entropy (SampEn). The applicability of ApEn and SampEn in analyzing data is evident and their use is increasing. However, consistency is a point of concern in these tools: the classification of the temporal organization of a data set might indicate that one series is relatively less ordered than another when the opposite is true. As highlighted by their proponents themselves, ApEn and SampEn might produce incorrect results due to this lack of consistency. In this study, we present a method that gains consistency by applying ApEn repeatedly over a wide range of combinations of window length and matching error tolerance. The tool is called volumetric approximate entropy, vApEn. We analyze nine artificially generated prototypical time series with different degrees of temporal order (combinations of sine waves, logistic maps with different control parameter values, and random noises). While ApEn/SampEn clearly fail to consistently identify the temporal order of the sequences, vApEn does so correctly. To validate the tool we performed shuffled and surrogate data analysis. Statistical analysis confirmed the consistency of the method.
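A minimal sketch of the construction in our own notation: standard ApEn computed over a grid of embedding dimensions m and tolerances r, then aggregated (here simply summed; the paper's exact grid and aggregation may differ):

```python
# Approximate entropy, plus a 'volumetric' aggregation over (m, r) pairs.
# The grids below are invented for illustration.
import math, random

def apen(series, m, r):
    """ApEn(m, r) = phi(m) - phi(m+1); self-matches keep the logs finite."""
    n = len(series)
    def phi(mm):
        templates = [series[i:i + mm] for i in range(n - mm + 1)]
        total = 0.0
        for a in templates:
            matches = sum(
                1 for b in templates
                if max(abs(u - v) for u, v in zip(a, b)) <= r
            )
            total += math.log(matches / len(templates))
        return total / len(templates)
    return phi(m) - phi(m + 1)

def vapen(series, ms=(1, 2, 3), r_fractions=(0.1, 0.15, 0.2, 0.25)):
    """Sum ApEn over a grid of window lengths and tolerances (x std dev)."""
    mean = sum(series) / len(series)
    sd = (sum((x - mean) ** 2 for x in series) / len(series)) ** 0.5
    return sum(apen(series, m, f * sd) for m in ms for f in r_fractions)

noise = [random.random() for _ in range(200)]
sine = [math.sin(0.2 * i) for i in range(200)]
print(vapen(noise) > vapen(sine))   # noise should rank as less ordered: True
```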
Abstract:
In this paper, we consider some non-homogeneous Poisson models to estimate the probability that an air quality standard is exceeded a given number of times in a time interval of interest. We assume that the exceedances occur according to a non-homogeneous Poisson process (NHPP). This Poisson process has rate function lambda(t), t >= 0, which depends on some parameters that must be estimated. We take into account two types of rate function: the Weibull and the Goel-Okumoto. We consider models with and without change-points; when the presence of change-points is assumed, there may be one, two or three of them, depending on the data set. The parameters of the rate functions are estimated using a Gibbs sampling algorithm. Results are applied to ozone data provided by the Mexico City monitoring network. In a first instance, we assume that no change-points are present; depending on the fit of the model, we then assume the presence of one, two or three change-points.
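In the parameterizations commonly used for these two NHPP rate functions (our notation; the paper's scaling may differ):

```latex
\[
  \lambda_{\mathrm{W}}(t) = \frac{\beta}{\sigma}
      \left(\frac{t}{\sigma}\right)^{\beta - 1},
  \qquad
  \lambda_{\mathrm{GO}}(t) = \alpha\beta\, e^{-\beta t},
  \qquad t \ge 0,
\]
```

the Weibull rate is monotone (increasing or decreasing with β), while the Goel-Okumoto rate decays exponentially to zero.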
Abstract:
In this paper, we consider the problem of estimating the number of times an air quality standard is exceeded in a given period of time. A non-homogeneous Poisson model is proposed to analyse this issue. The rate at which the Poisson events occur is given by a rate function lambda(t), t >= 0. This rate function also depends on some parameters that need to be estimated. Two forms of lambda(t), t >= 0 are considered: one of the Weibull form and the other of the exponentiated-Weibull form. The parameters are estimated using a Bayesian formulation based on the Gibbs sampling algorithm. The prior distributions for the parameters are assigned in two stages. In the first stage, non-informative prior distributions are considered; using the information provided by the first stage, more informative prior distributions are used in the second. The theoretical development is applied to data provided by the monitoring network of Mexico City. The rate function that best fits the data varies according to the region of the city and/or the threshold considered. In some cases the best fit is the Weibull form and in others the best option is the exponentiated-Weibull form.
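One common way to obtain an exponentiated-Weibull rate (our parameterization, for illustration) is to take the mean-value function of the process proportional to the exponentiated-Weibull distribution function, so that

```latex
\[
  m(t) = \alpha \left[ 1 - e^{-(t/\sigma)^{\beta}} \right]^{\theta},
  \qquad
  \lambda(t) = \frac{dm}{dt}
  = \frac{\alpha\theta\beta}{\sigma}
    \left(\frac{t}{\sigma}\right)^{\beta - 1}
    e^{-(t/\sigma)^{\beta}}
    \left[ 1 - e^{-(t/\sigma)^{\beta}} \right]^{\theta - 1},
\]
```

which reduces to a Weibull-type model when θ = 1.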
Abstract:
In this paper we make use of stochastic volatility models to analyse the behaviour of a series of weekly ozone averages. The models considered here have previously been used in problems related to financial time series. Two models are considered, and their parameters are estimated using a Bayesian approach based on Markov chain Monte Carlo (MCMC) methods. Both models are applied to the data provided by the monitoring network of the Metropolitan Area of Mexico City. The selection of the best model for this specific data set is performed using the Deviance Information Criterion and the Conditional Predictive Ordinate method.
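The abstract does not write the models out, but the canonical discrete-time stochastic volatility model from this financial-time-series literature has the form (our notation):

```latex
\[
  y_t = e^{h_t / 2}\, \varepsilon_t,
  \qquad
  h_{t+1} = \mu + \phi\,(h_t - \mu) + \sigma_\eta\, \eta_t,
  \qquad
  \varepsilon_t,\ \eta_t \overset{iid}{\sim} N(0, 1),
\]
```

where y_t is the mean-corrected observation and h_t the latent log-volatility; MCMC is needed precisely because the h_t are unobserved.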