6 results for Credit Constraints

in Archivo Digital para la Docencia y la Investigación - Repositorio Institucional de la Universidad del País Vasco


Relevance:

20.00%

Publisher:

Abstract:

This paper analyzes auctions where bidders face financial constraints that may force them to resell part of the property of the good (or subcontract part of a project) in a resale market. First we show that the inefficient speculative equilibria of second-price auctions (Garratt and Tröger, 2006) generalize to situations with partial resale where only the high-value bidder is financially constrained. However, when all players face financial constraints, the inefficient speculative equilibria disappear. Therefore, for auctioning large facilities or contracts where all bidders are financially constrained and there is a resale market, the second-price auction remains a simple and appropriate mechanism for achieving an efficient allocation.
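
A minimal Python sketch of the setting above, not the paper's model: bidders bid the minimum of value and budget (an illustrative strategy, not an equilibrium claim), and a budget-constrained winner resells part of the good to the highest-value loser. The valuation numbers, budgets, and resale rule are all assumptions for illustration.

```python
def second_price_with_resale(values, budgets):
    """One stylized second-price auction with partial resale.

    Bidders bid min(value, budget) (illustrative, not an equilibrium).
    The winner pays the second-highest bid; an illustrative rule lets a
    constrained winner keep only the fraction budget/value of the good
    and resell the rest at the highest losing value.
    """
    bids = [min(v, b) for v, b in zip(values, budgets)]
    order = sorted(range(len(bids)), key=lambda i: bids[i], reverse=True)
    winner, runner_up = order[0], order[1]
    price = bids[runner_up]
    kept = min(1.0, budgets[winner] / values[winner])   # affordable share
    resold = 1.0 - kept
    # Resold fraction goes to the highest-value losing bidder.
    resale_price = resold * max(values[i] for i in order[1:])
    surplus = kept * values[winner] - price + resale_price
    return {"winner": winner, "price": price,
            "resold_fraction": resold, "winner_surplus": surplus}

# Example: the high-value bidder is budget-constrained, wins anyway,
# and resells 40% of the good in the resale market.
print(second_price_with_resale(values=[10.0, 7.0, 5.0],
                               budgets=[6.0, 4.0, 5.0]))
```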

Relevance:

20.00%

Publisher:

Abstract:

Mortgage loan interest rates are usually the result of a bank-customer negotiation process. Credit risk, consumer cross-buying potential, bundling, financial market competition and other features affecting the bargaining power of the parties can affect the price. We argue that, since a mortgage loan is a complex product, consumer expertise could be a relevant factor in mortgage pricing. Using data on mortgage loan prices for a sample of 1055 households for the year 2005 (Bank of Spain Survey of Household Finances, EFF-2005), and including credit risk, costs, the consumer's potential to generate future business, and bank competition variables, the regression results indicate that consumer expertise-related metrics are highly significant predictors of mortgage loan prices. Other factors such as credit risk and consumer cross-buying potential do not have such a significant impact on mortgage prices. Our empirical results reflect the credit conditions that prevailed before the financial crisis and may shed some light on this issue.
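
As a hedged illustration of the kind of regression described, the sketch below fits an OLS model of a mortgage-rate spread on synthetic stand-ins for the expertise, credit-risk, cross-buying, and competition variables. The data-generating process is invented and merely mimics the paper's qualitative finding; it does not use the EFF-2005 data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1055  # sample size matching the EFF-2005 subsample in the abstract

# Synthetic stand-ins for the regressors named in the abstract
# (variable construction here is purely illustrative).
expertise   = rng.normal(size=n)   # consumer financial-expertise proxy
credit_risk = rng.normal(size=n)   # borrower credit-risk proxy
cross_buy   = rng.normal(size=n)   # cross-buying potential proxy
competition = rng.normal(size=n)   # local bank-competition proxy

# Hypothetical data-generating process in which expertise dominates,
# mirroring the paper's qualitative finding.
spread = (1.5 - 0.30 * expertise + 0.05 * credit_risk
          + 0.02 * cross_buy - 0.10 * competition
          + rng.normal(scale=0.2, size=n))

X = sm.add_constant(np.column_stack(
    [expertise, credit_risk, cross_buy, competition]))
fit = sm.OLS(spread, X).fit()
print(fit.summary(xname=["const", "expertise", "credit_risk",
                         "cross_buy", "competition"]))
```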

Relevance:

20.00%

Publisher:

Abstract:

In this work we extend to the multistage case two recent risk-averse measures for two-stage stochastic programs based on first- and second-order stochastic dominance constraints induced by mixed-integer linear recourse. Additionally, we consider Time Stochastic Dominance (TSD) along a given horizon. Given the dimensions of even medium-sized problems once they are augmented with the new variables and constraints required by these risk measures, it is unrealistic to solve the problem to optimality by plain use of MIP solvers within a reasonable computing time. Instead, decomposition algorithms of some type should be used. We present an extension of our Branch-and-Fix Coordination algorithm, named BFC-TSD, in which special treatment is given to cross-scenario-group constraints that link variables from different scenario groups. A broad computational experience is presented comparing the risk-neutral approach and the tested risk-averse strategies. The performance of the new version of the BFC algorithm versus the plain use of a state-of-the-art MIP solver is also reported.
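
As a small illustration of the kind of constraint involved, the sketch below adds a first-order stochastic dominance style risk constraint (P(cost > eta) <= beta, enforced via big-M binaries) to a toy two-stage model in PuLP. The benchmark pair, big-M value, and toy data are assumptions; the multistage TSD measures and the BFC-TSD decomposition itself are not reproduced here.

```python
from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpBinary

scenarios = [0, 1, 2]
w = {0: 0.3, 1: 0.4, 2: 0.3}            # scenario probabilities
demand = {0: 4.0, 1: 6.0, 2: 9.0}       # scenario demands (toy data)
c_first, c_second = 1.0, 3.0            # first-stage vs. recourse unit costs
eta, beta_bar, big_m = 20.0, 0.4, 100.0  # benchmark: P(cost > eta) <= beta_bar

prob = LpProblem("fsd_toy", LpMinimize)
x = LpVariable("x", lowBound=0)                                # first stage
y = {s: LpVariable(f"y_{s}", lowBound=0) for s in scenarios}   # recourse
nu = {s: LpVariable(f"nu_{s}", cat=LpBinary) for s in scenarios}

cost = {s: c_first * x + c_second * y[s] for s in scenarios}
prob += lpSum(w[s] * cost[s] for s in scenarios)   # risk-neutral objective
for s in scenarios:
    prob += y[s] >= demand[s] - x                  # meet scenario demand
    prob += cost[s] - eta <= big_m * nu[s]         # nu_s = 1 if cost exceeds eta
prob += lpSum(w[s] * nu[s] for s in scenarios) <= beta_bar  # risk cap

prob.solve()
print(x.value(), {s: y[s].value() for s in scenarios})
```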

Relevance:

20.00%

Publisher:

Abstract:

In this work we investigate whether a small fraction of quarks and gluons, which escaped hadronization and survived as a uniformly spread perfect fluid, can play the role of both dark matter and dark energy. This fluid, as developed in [1], is characterized by two main parameters: beta, related to the amount of quarks and gluons acting as dark matter; and gamma, acting as the cosmological constant. We explore the feasibility of this model at cosmological scales using data from Type Ia Supernovae (SNeIa), Long Gamma-Ray Bursts (LGRB) and direct observational Hubble data. We find that: (i) in general, beta cannot be constrained by SNeIa, LGRB or H(z) data; (ii) gamma can be constrained quite well by all three data sets, contributing approximately 78% of the energy-matter content; (iii) when a strong prior on (only) baryonic matter is assumed, the two parameters of the model are constrained successfully.
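
A rough sketch of how such parameters can be constrained with H(z)-type data via a chi-square grid. The Hubble function below is a placeholder in which beta scales like pressureless matter and gamma like a cosmological constant, mimicking the roles the abstract assigns them; it is not the model of [1], and the "observations" are synthetic.

```python
import numpy as np

H0, omega_b = 70.0, 0.05  # km/s/Mpc; illustrative baryon fraction

def hubble(z, beta, gamma):
    # Placeholder expansion law: (omega_b + beta) behaves like matter,
    # gamma like a cosmological constant (assumption, not the paper's model).
    return H0 * np.sqrt((omega_b + beta) * (1 + z) ** 3 + gamma)

# Synthetic "observations" drawn from the placeholder model (not real data).
rng = np.random.default_rng(1)
z_obs = np.linspace(0.1, 2.0, 15)
sigma = 5.0
h_obs = hubble(z_obs, beta=0.25, gamma=0.72) \
        + rng.normal(scale=sigma, size=z_obs.size)

# Chi-square over a (beta, gamma) grid; the near-flat direction in beta
# illustrates why beta alone is hard to constrain without a matter prior.
betas = np.linspace(0.0, 0.5, 101)
gammas = np.linspace(0.4, 1.0, 121)
B, G = np.meshgrid(betas, gammas, indexing="ij")
chi2 = ((h_obs - hubble(z_obs[None, None, :],
                        B[..., None], G[..., None])) ** 2
        / sigma ** 2).sum(axis=-1)
i, j = np.unravel_index(chi2.argmin(), chi2.shape)
print(f"best fit: beta={betas[i]:.2f}, gamma={gammas[j]:.2f}")
```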

Relevance:

20.00%

Publisher:

Abstract:

Plant community ecologists use the null model approach to infer assembly processes from observed patterns of species co-occurrence. In about a third of published studies, the null hypothesis of random assembly cannot be rejected. When this occurs, plant ecologists interpret the observed random pattern as not environmentally constrained but probably generated by stochastic processes. The null model approach (using the C-score and the discrepancy index) was used to test for random assembly under two simulation algorithms. Logistic regression, distance-based redundancy analysis, and constrained ordination were used to test for environmental determinism (species segregation along environmental gradients, or turnover, and species aggregation). This article introduces an environmentally determined community of alpine hydrophytes that presents itself as randomly assembled. We suggest that the random pattern arises in this community as follows: two simultaneous environmental processes, one leading to species aggregation and the other leading to species segregation, concurrently generate the observed pattern, which turns out to be neither aggregated nor segregated but random. A simulation study supports this suggestion. Although apparently simple, the null model approach seems to assume that a single ecological factor prevails, or that, if several factors decisively influence the community, they all exert their influence in the same direction, generating either aggregation or segregation. As these assumptions are unlikely to hold in most cases, and assembly processes cannot be inferred from random patterns, we propose that plant ecologists specifically investigate the ecological processes responsible for observed random patterns, instead of trying to infer processes from patterns.
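
For concreteness, the sketch below computes the C-score and runs one simple equiprobable null model on a toy presence/absence matrix; the exact swap algorithms and the discrepancy index used in the study are not reproduced here.

```python
import numpy as np

def c_score(m):
    """Mean number of 'checkerboard units' over all species pairs:
    (r_i - s_ij) * (r_j - s_ij), where r is each species' site total
    and s_ij the number of sites the pair shares."""
    m = np.asarray(m, dtype=int)
    r = m.sum(axis=1)                  # species (row) totals
    s = m @ m.T                        # pairwise co-occurrence counts
    iu = np.triu_indices(m.shape[0], k=1)
    return np.mean((r[iu[0]] - s[iu]) * (r[iu[1]] - s[iu]))

def null_test(m, n_iter=2000, seed=0):
    """Two-tailed permutation test shuffling presences within each species
    row (fixed species frequencies, equiprobable sites)."""
    rng = np.random.default_rng(seed)
    obs = c_score(m)
    null = np.empty(n_iter)
    for k in range(n_iter):
        shuffled = np.array([rng.permutation(row) for row in np.asarray(m)])
        null[k] = c_score(shuffled)
    p = np.mean(np.abs(null - null.mean()) >= abs(obs - null.mean()))
    return obs, p

# Toy matrix: rows = species, columns = sites.
mat = np.array([[1, 1, 0, 0, 1],
                [0, 1, 1, 0, 0],
                [1, 0, 0, 1, 1],
                [0, 0, 1, 1, 0]])
print(null_test(mat))
```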