7 results for Pairwise constraints

in Archivo Digital para la Docencia y la Investigación - Repositorio Institucional de la Universidad del País Vasco


Relevance: 20.00%

Abstract:

This paper analyzes auctions where bidders face financial constraints that may force them to resell part of the ownership of the good (or subcontract part of a project) on a resale market. We first show that the inefficient speculative equilibria of second-price auctions (Garratt and Tröger, 2006) generalize to situations with partial resale where only the high-value bidder is financially constrained. However, when all players face financial constraints, the inefficient speculative equilibria disappear. Therefore, for auctioning large facilities or contracts where all bidders are financially constrained and a resale market exists, the second-price auction remains a simple and appropriate mechanism for achieving an efficient allocation.
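
To make the baseline mechanism concrete, here is a minimal Python sketch of a sealed-bid second-price auction. It is hypothetical, not the paper's model: a simple budget cap stands in for the financial constraints, and the resale stage is omitted.

```python
# A minimal sketch (hypothetical, not the paper's model): a sealed-bid
# second-price auction where a budget cap stands in for the financial
# constraints; the resale stage studied in the paper is omitted.

def second_price_auction(values, budgets=None):
    """Return (winner index, price paid) for truthful sealed bids."""
    # Without constraints, bidding one's true value is weakly dominant.
    bids = list(values)
    if budgets is not None:
        # A constrained bidder cannot bid above its budget.
        bids = [min(v, b) for v, b in zip(values, budgets)]
    winner = max(range(len(bids)), key=lambda i: bids[i])
    price = sorted(bids, reverse=True)[1]  # second-highest bid
    return winner, price

values = [10.0, 7.0, 4.0]
print(second_price_auction(values))                           # (0, 7.0): efficient
print(second_price_auction(values, budgets=[5.0, 9.0, 9.0]))  # (1, 5.0): inefficient
```

In the second call, the binding budget keeps the high-value bidder from winning, which is the kind of distortion the resale market in the paper is meant to correct.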

Relevance: 20.00%

Abstract:

We consider cooperation situations where players have network relations. Networks evolve according to a stationary transition probability matrix, and at each moment in time players receive payoffs from a stationary allocation rule. Players discount the future by a common factor. The pair formed by an allocation rule and a transition probability matrix is called a forward-looking network formation scheme if, first, the probability that a link is created is positive if the discounted, expected gains to its two participants are positive, and if, second, the probability that a link is eliminated is positive if the discounted, expected gains to at least one of its two participants are positive. The main result is the existence, for all discount factors and all value functions, of a forward-looking network formation scheme. Furthermore, we can always find a forward-looking network formation scheme such that (i) the allocation rule is component balanced and (ii) the transition probabilities increase in the difference in payoffs for the corresponding players responsible for the transition. We use this dynamic solution concept to explore the tension between efficiency and stability.
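
The defining conditions can be restated in symbols; the notation below is illustrative, not necessarily the paper's own. Let Y_i(g) be player i's payoff under the allocation rule in network g, delta the common discount factor, and V_i(g) the expected discounted payoff from state g.

```latex
% Notation is illustrative, not the paper's own.
V_i(g) = \mathbb{E}\Big[ \sum_{t=0}^{\infty} \delta^t \, Y_i(g_t) \,\Big|\, g_0 = g \Big]

% Link creation: positive probability if both participants gain.
\Pr(g \to g + ij) > 0 \ \text{ if } \ V_i(g+ij) - V_i(g) > 0 \ \text{ and } \ V_j(g+ij) - V_j(g) > 0

% Link elimination: positive probability if at least one participant gains.
\Pr(g \to g - ij) > 0 \ \text{ if } \ V_i(g-ij) - V_i(g) > 0 \ \text{ or } \ V_j(g-ij) - V_j(g) > 0
```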

Relevance: 20.00%

Abstract:

In this work we extend to the multistage case two recent risk-averse measures for two-stage stochastic programs, based on first- and second-order stochastic dominance constraints induced by mixed-integer linear recourse. Additionally, we consider Time Stochastic Dominance (TSD) along a given horizon. Given the dimensions of medium-sized problems once they are augmented by the new variables and constraints required by these risk measures, it is unrealistic to solve the problem to optimality by plain use of MIP solvers within a reasonable computing time. Instead, decomposition algorithms should be used. We present an extension of our Branch-and-Fix Coordination algorithm, named BFC-TSD, in which special treatment is given to the cross-scenario-group constraints that link variables from different scenario groups. A broad computational experience is reported, comparing the risk-neutral approach with the tested risk-averse strategies. The performance of the new version of the BFC algorithm versus the plain use of a state-of-the-art MIP solver is also reported.
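
To illustrate the kind of constraint involved, here is a minimal sketch, under made-up scenario data, of one standard way to impose a first-order stochastic dominance constraint in a mixed-integer program via big-M indicator variables. It is only a sketch of the constraint type, not the paper's BFC-TSD decomposition.

```python
# A minimal sketch, under made-up data: a first-order stochastic
# dominance constraint imposed via big-M indicator binaries.
# This is not the paper's BFC-TSD decomposition.
from pulp import LpBinary, LpMinimize, LpProblem, LpVariable, lpSum, value

probs  = [0.25, 0.25, 0.25, 0.25]   # scenario probabilities
margin = [-1.0, 1.0, -2.0, 3.0]     # hypothetical per-unit profit by scenario
eta, alpha = 3.0, 0.25              # benchmark: P(loss > eta) <= alpha
M = 1000.0                          # big-M upper bound on scenario losses

model = LpProblem("fsd_sketch", LpMinimize)
x = LpVariable("x", lowBound=0, upBound=100)
# Objective: maximize expected profit (minimize its negative).
model += -lpSum(p * m * x for p, m in zip(probs, margin))

# nu[s] = 1 permits the loss in scenario s to exceed the threshold eta.
nu = [LpVariable(f"nu_{s}", cat=LpBinary) for s in range(len(probs))]
for s in range(len(probs)):
    loss_s = -margin[s] * x                 # loss = negative profit
    model += loss_s - eta <= M * nu[s]      # indicator linking constraint
model += lpSum(p * n for p, n in zip(probs, nu)) <= alpha

model.solve()
print("x =", value(x))  # the dominance constraint caps x at 3 here
```

With these numbers, expected profit grows in x, but the benchmark allows at most one scenario (probability 0.25) whose loss exceeds 3, which binds the solution at x = 3.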

Relevance: 20.00%

Abstract:

In this work we investigate whether a small fraction of quarks and gluons, which escaped hadronization and survived as a uniformly spread perfect fluid, can play the role of both dark matter and dark energy. This fluid, as developed in [1], is characterized by two main parameters: beta, related to the amount of quarks and gluons acting as dark matter, and gamma, which acts as the cosmological constant. We explore the feasibility of this model at cosmological scales using data from type Ia Supernovae (SNeIa), Long Gamma-Ray Bursts (LGRB), and direct observational Hubble data. We find that (i) in general, beta cannot be constrained by the SNeIa, LGRB, or H(z) data; (ii) gamma can be constrained quite well by all three data sets, contributing approximately 78% to the energy-matter content; and (iii) when a strong prior on (only) baryonic matter is assumed, both parameters of the model are constrained successfully.
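
The H(z) part of such an analysis reduces to a chi-square fit of a parametrized expansion rate. The sketch below uses a hypothetical stand-in form E(z)^2 = beta*(1+z)^3 + gamma (a matter-like term plus a constant), not the actual equations of [1], and synthetic data rather than the observational sets used in the paper.

```python
# A minimal sketch of the H(z) chi-square fitting step. The form
# E(z)^2 = beta*(1+z)^3 + gamma is a hypothetical stand-in for the
# quark-gluon fluid model of [1], and the data below are synthetic;
# the paper fits SNeIa, LGRB and observational Hubble data instead.
import numpy as np
from scipy.optimize import curve_fit

def hubble(z, h0, beta, gamma):
    """H(z) with a matter-like beta term and a constant gamma term."""
    return h0 * np.sqrt(beta * (1.0 + z) ** 3 + gamma)

rng = np.random.default_rng(0)
z = np.linspace(0.1, 2.0, 20)
sigma = 5.0 * np.ones_like(z)                  # toy error bars (km/s/Mpc)
h_obs = hubble(z, 70.0, 0.3, 0.7) + rng.normal(0.0, sigma)

popt, pcov = curve_fit(hubble, z, h_obs, p0=(70.0, 0.3, 0.7), sigma=sigma)
for name, p, e in zip(("H0", "beta", "gamma"), popt, np.sqrt(np.diag(pcov))):
    print(f"{name} = {p:7.3f} +/- {e:.3f}")
```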

Relevance: 20.00%

Abstract:

Plant community ecologists use the null model approach to infer assembly processes from observed patterns of species co-occurrence. In about a third of published studies, the null hypothesis of random assembly cannot be rejected. When this occurs, plant ecologists conclude that the observed random pattern is not environmentally constrained but was probably generated by stochastic processes. The null model approach (using the C-score and the discrepancy index) was used to test for random assembly under two simulation algorithms. Logistic regression, distance-based redundancy analysis, and constrained ordination were used to test for environmental determinism (species segregation along environmental gradients, or turnover, and species aggregation). This article introduces an environmentally determined community of alpine hydrophytes that presents itself as randomly assembled. We suggest that the random pattern arises in this community as follows: two simultaneous environmental processes, one leading to species aggregation and the other to species segregation, concurrently generate the observed pattern, which turns out to be neither aggregated nor segregated, but random. A simulation study supports this suggestion. Although apparently simple, the null model approach seems to assume that a single ecological factor prevails, or that if several factors decisively influence the community, they all exert their influence in the same direction, generating either aggregation or segregation. As these assumptions are unlikely to hold in most cases, and assembly processes cannot be inferred from random patterns, we propose that plant ecologists specifically investigate the ecological processes responsible for observed random patterns, instead of trying to infer processes from patterns.
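
The C-score test mentioned above can be sketched in a few lines: compute the observed checkerboard score of a species-by-site presence-absence matrix, then compare it with scores of null matrices generated by 2x2 checkerboard swaps (which preserve row and column totals). The matrix here is toy data, not the alpine hydrophyte community, and the swap count and replicate number are arbitrary.

```python
# A minimal sketch of the C-score null model test on a toy
# presence-absence matrix (not the alpine hydrophyte data).
import numpy as np

def c_score(m):
    """Mean checkerboard units over all species (row) pairs."""
    r = m.sum(axis=1)                        # species incidences
    shared = m @ m.T                         # sites shared by each pair
    i, j = np.triu_indices(len(m), k=1)
    return np.mean((r[i] - shared[i, j]) * (r[j] - shared[i, j]))

def swap_null(m, n_swaps=1000, rng=None):
    """Return a null matrix built from random 2x2 checkerboard swaps."""
    rng = rng or np.random.default_rng()
    m = m.copy()
    for _ in range(n_swaps):
        r = rng.choice(len(m), 2, replace=False)
        c = rng.choice(m.shape[1], 2, replace=False)
        sub = m[np.ix_(r, c)]
        # Swap only 2x2 checkerboards; this preserves all marginals.
        if sub[0, 0] == sub[1, 1] and sub[0, 1] == sub[1, 0] and sub[0, 0] != sub[0, 1]:
            m[np.ix_(r, c)] = 1 - sub
    return m

rng = np.random.default_rng(1)
obs = rng.integers(0, 2, size=(10, 15))      # toy species-by-site matrix
observed = c_score(obs)
null = [c_score(swap_null(obs, rng=rng)) for _ in range(200)]
p = np.mean([n >= observed for n in null])   # one-tailed test for segregation
print(f"C-score = {observed:.2f}, p = {p:.3f}")
```

A non-significant p here is exactly the "random" outcome the article warns about: it need not mean the community is unstructured, since opposing aggregating and segregating processes can cancel in this single statistic.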