88 results for Selection index
Abstract:
In this paper, we present a matching model with adverse selection that explains why flows into and out of unemployment are much lower in Europe than in North America, while employment-to-employment flows are similar in the two continents. In the model, firms use discretion over whom to fire and, thus, low-quality workers are more likely to be dismissed than high-quality workers. Moreover, as hiring and firing costs increase, firms find it more costly to hire a bad worker and, thus, they prefer to hire out of the pool of employed job seekers rather than out of the pool of the unemployed, who are more likely to turn out to be 'lemons'. We use microdata for Spain and the U.S. and find that the ratio of the job-finding probability of the unemployed to the job-finding probability of employed job seekers was smaller in Spain than in the U.S. Furthermore, using U.S. data, we find that discrimination against the unemployed increased over the 1980s in those states that raised firing costs by introducing exceptions to the employment-at-will doctrine.
Abstract:
This paper argues that the strategic use of debt favours the revelation of information in dynamic adverse selection problems. Our argument is based on the idea that debt is a credible commitment to end long term relationships. Consequently, debt encourages a privately informed party to disclose its information at early stages of a relationship. We illustrate our point with the financing decision of a monopolist selling a good to a buyer whose valuation is private information. A high level of (renegotiable) debt, by increasing the scope for liquidation, may induce the high valuation buyer to buy early at a high price and thus increase the monopolist's expected payoff. By affecting the buyer's strategy, it may reduce the probability of excessive liquidation. We investigate the consequences of good durability and we examine the way debt may alleviate the ratchet effect.
Abstract:
That individuals contribute in social dilemma interactions even when contributing is costly is a well-established observation in the experimental literature. Since a contributor is always strictly worse off than a non-contributor, the question arises whether an intrinsic motivation to contribute can survive in an evolutionary setting. Using recent results on the deterministic approximation of stochastic evolutionary dynamics, we give conditions for equilibria with a positive number of contributors to be selected in the long run.
Abstract:
We perform an experiment on a pure coordination game with uncertainty about the payoffs. Our game is closely related to models that have been used in many macroeconomic and financial applications to solve problems of equilibrium indeterminacy. In our experiment each subject receives a noisy signal about the true payoffs. This game has a unique strategy profile that survives the iterative deletion of strictly dominated strategies (thus a unique Nash equilibrium). The equilibrium outcome coincides, on average, with the risk-dominant equilibrium outcome of the underlying coordination game. The behavior of the subjects converges to the theoretical prediction after enough experience has been gained. The data (and the comments) suggest that subjects do not apply the iterated deletion of dominated strategies through a priori reasoning; instead, they adapt to the responses of other players. Thus, the length of the learning phase clearly varies across the different signals. We also test behavior in a game without uncertainty as a benchmark case. The game with uncertainty is inspired by the "global" games of Carlsson and Van Damme (1993).
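As a reference point for the risk-dominance criterion invoked in this abstract, the following Python sketch applies the Harsanyi-Selten test (the equilibrium with the larger product of unilateral deviation losses) to a generic 2x2 coordination game. The payoff matrices are illustrative placeholders, not the payoffs used in the experiment.

```python
# Minimal sketch: the Harsanyi-Selten risk-dominance test for a 2x2 coordination game.
# The payoff matrices are illustrative placeholders, not the experimental parameters.

def risk_dominant(u1, u2):
    """u1[i][j], u2[i][j]: payoffs of players 1 and 2 when they play actions i, j in {0, 1}.
    Assumes (0, 0) and (1, 1) are both strict Nash equilibria (a coordination game).
    Returns the equilibrium with the larger product of unilateral deviation losses."""
    loss_00 = (u1[0][0] - u1[1][0]) * (u2[0][0] - u2[0][1])
    loss_11 = (u1[1][1] - u1[0][1]) * (u2[1][1] - u2[1][0])
    return (0, 0) if loss_00 >= loss_11 else (1, 1)

# Example: a stag-hunt-like coordination game where the safe action risk-dominates.
u1 = [[4, 0],
      [3, 3]]
u2 = [[4, 3],
      [0, 3]]
print(risk_dominant(u1, u2))  # -> (1, 1)
```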
Abstract:
It has long been standard in agency theory to search for incentive-compatible mechanisms on the assumption that people care only about their own material wealth. However, this assumption is clearly refuted by numerous experiments, and we feel that it may be useful to consider nonpecuniary utility in mechanism design and contract theory. Accordingly, we devise an experiment to explore optimal contracts in an adverse-selection context. A principal proposes one of three contract menus, each of which offers a choice of two incentive-compatible contracts, to two agents whose types are unknown to the principal. The agents know the set of possible menus and choose either to accept one of the two contracts offered in the proposed menu or to reject the menu altogether; a rejection by either agent leads to lower (and equal) reservation payoffs for all parties. While all three possible menus favor the principal, they do so to varying degrees. We observe numerous rejections of the more lopsided menus, and behavior approaches an equilibrium in which one of the more equitable contract menus (which one depends on the reservation payoffs) is proposed and agents accept a contract, selecting actions according to their types. Behavior is largely consistent with all recent models of social preferences, strongly suggesting that there is value in considering nonpecuniary utility in agency theory.
Abstract:
Given $n$ independent replicates of a jointly distributed pair $(X,Y)\in {\cal R}^d \times {\cal R}$, we wish to select from a fixed sequence of model classes ${\cal F}_1, {\cal F}_2, \ldots$ a deterministic prediction rule $f: {\cal R}^d \to {\cal R}$ whose risk is small. We investigate the possibility of empirically assessing the {\em complexity} of each model class, that is, the actual difficulty of the estimation problem within each class. The estimated complexities are in turn used to define an adaptive model selection procedure, which is based on complexity penalized empirical risk. The available data are divided into two parts. The first is used to form an empirical cover of each model class, and the second is used to select a candidate rule from each cover based on empirical risk. The covering radii are determined empirically to optimize a tight upper bound on the estimation error. An estimate is chosen from the list of candidates in order to minimize the sum of class complexity and empirical risk. A distinguishing feature of the approach is that the complexity of each model class is assessed empirically, based on the size of its empirical cover. Finite sample performance bounds are established for the estimates, and these bounds are applied to several non-parametric estimation problems. The estimates are shown to achieve a favorable tradeoff between approximation and estimation error, and to perform as well as if the distribution-dependent complexities of the model classes were known beforehand. In addition, it is shown that the estimate can be consistent, and even possess near optimal rates of convergence, when each model class has an infinite VC or pseudo dimension. For regression estimation with squared loss we modify our estimate to achieve a faster rate of convergence.
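The selection step described in this abstract can be illustrated with a heavily simplified sketch: polynomial classes of increasing degree stand in for ${\cal F}_1, {\cal F}_2, \ldots$, fits on bootstrap subsamples of the first half of the data stand in for the empirical cover, and a log-cardinality term stands in for the paper's complexity bound. None of these placeholders reproduces the actual construction; the sketch only shows the generic shape of "candidate per class plus complexity penalty" selection.

```python
# Hedged sketch of complexity-penalized selection with a two-part data split.
# The model classes, the "empirical covers" (approximated here by bootstrap fits),
# and the log-cardinality penalty are simplified placeholders; the paper's cover
# construction and finite-sample bounds are not reproduced.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data.
n = 200
x = rng.uniform(-1, 1, n)
y = np.sin(3 * x) + 0.2 * rng.standard_normal(n)

# First half builds candidate rules, second half scores them.
x1, y1, x2, y2 = x[: n // 2], y[: n // 2], x[n // 2:], y[n // 2:]

def risk(coefs, xs, ys):
    return np.mean((np.polyval(coefs, xs) - ys) ** 2)

best = None
for degree in range(1, 7):                         # model classes F_1, F_2, ...
    # Crude stand-in for an empirical cover of the class of degree-d polynomials.
    cover = []
    for _ in range(10):
        idx = rng.integers(0, len(x1), len(x1))    # bootstrap subsample of part one
        cover.append(np.polyfit(x1[idx], y1[idx], degree))
    # Candidate for this class: the cover member with smallest empirical risk on part two.
    cand = min(cover, key=lambda c: risk(c, x2, y2))
    # Placeholder complexity term: log-cardinality of the cover over the sample size.
    penalty = np.log(len(cover) + degree) / len(x2)
    score = risk(cand, x2, y2) + penalty
    if best is None or score < best[0]:
        best = (score, degree, cand)

print("selected class: polynomials of degree", best[1])
```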
Abstract:
We develop a mathematical programming approach for the classical PSPACE-hard restless bandit problem in stochastic optimization. We introduce a hierarchy of n (where n is the number of bandits) increasingly stronger linear programming relaxations, the last of which is exact and corresponds to the (exponential size) formulation of the problem as a Markov decision chain, while the other relaxations provide bounds and are efficiently computed. We also propose a priority-index heuristic scheduling policy from the solution to the first-order relaxation, where the indices are defined in terms of optimal dual variables. In this way we propose a policy and a suboptimality guarantee. We report results of computational experiments that suggest that the proposed heuristic policy is nearly optimal. Moreover, the second-order relaxation is found to provide strong bounds on the optimal value.
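A priority-index policy of the kind proposed in this abstract can be sketched as follows, with the important caveat that the indices below are random placeholders: in the paper they are computed from optimal dual variables of the first-order relaxation, which this toy simulation does not solve.

```python
# Hedged sketch of a priority-index scheduling policy for a toy restless bandit instance.
# The state-dependent indices here are random placeholders; in the paper they are
# derived from optimal dual variables of the first-order LP relaxation (not solved here).
import numpy as np

rng = np.random.default_rng(1)
n_bandits, n_states, horizon, m_active = 4, 3, 50, 2

indices = rng.uniform(0, 1, size=(n_bandits, n_states))   # placeholder priority indices
rewards = rng.uniform(0, 1, size=(n_bandits, n_states))   # reward earned when active

def random_stochastic(shape):
    p = rng.uniform(size=shape)
    return p / p.sum(axis=-1, keepdims=True)

# Transition matrices P[a][i, s, s'] for action a in {0: passive, 1: active}.
P = [random_stochastic((n_bandits, n_states, n_states)) for _ in range(2)]

states = rng.integers(0, n_states, n_bandits)
total_reward = 0.0
for _ in range(horizon):
    # Activate the m_active bandits whose current states carry the highest indices.
    active = np.argsort(indices[np.arange(n_bandits), states])[-m_active:]
    total_reward += rewards[active, states[active]].sum()
    for i in range(n_bandits):
        a = 1 if i in active else 0
        states[i] = rng.choice(n_states, p=P[a][i, states[i]])

print("average reward per period:", total_reward / horizon)
```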
Abstract:
It is shown that preferences can be constructed from observed choice behavior in a way that is robust to indifferent selection (i.e., the agent is indifferent between two alternatives but, nevertheless, is only observed selecting one of them). More precisely, a suggestion by Savage (1954) to reveal indifferent selection by considering small monetary perturbations of alternatives is formalized and generalized to a purely topological framework: preferences over an arbitrary topological space can be uniquely derived from observed behavior under the assumptions that they are continuous and nonsatiated and that a strictly preferred alternative is always chosen, and indifferent selection is then characterized by discontinuity in choice behavior. Two particular cases are then analyzed: monotonic preferences over a partially ordered set, and preferences representable by a continuous pseudo-utility function.
Abstract:
This paper characterizes the relationship between entrepreneurial wealth and aggregate investment under adverse selection. Its main finding is that such a relationship need not be monotonic. In particular, three results emerge from the analysis: (i) pooling equilibria, in which investment is independent of entrepreneurial wealth, are more likely to arise when entrepreneurial wealth is relatively low; (ii) separating equilibria, in which investment is increasing in entrepreneurial wealth, are most likely to arise when entrepreneurial wealth is relatively high; and (iii) for a given interest rate, an increase in entrepreneurial wealth may generate a discontinuous fall in investment.
Abstract:
We study model selection strategies based on penalized empirical loss minimization. We point out a tight relationship between error estimation and data-based complexity penalization: any good error estimate may be converted into a data-based penalty function, and the performance of the estimate is governed by the quality of the error estimate. We consider several penalty functions, involving error estimates on independent test data, empirical {\sc vc} dimension, empirical {\sc vc} entropy, and margin-based quantities. We also consider the maximal difference between the error on the first half of the training data and the second half, and the expected maximal discrepancy, a closely related capacity estimate that can be calculated by Monte Carlo integration. Maximal discrepancy penalty functions are appealing for pattern classification problems, since their computation is equivalent to empirical risk minimization over the training data with some labels flipped.
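The maximal-discrepancy penalty mentioned in this abstract is the largest gap, over the hypothesis class, between the empirical errors on the two halves of the training data; for 0-1 loss it can be obtained via empirical risk minimization with half the labels flipped. The sketch below instead computes it by brute force for a toy class of one-dimensional threshold classifiers, purely to illustrate the quantity rather than the paper's procedure.

```python
# Hedged sketch: the maximal-discrepancy penalty for a toy class of 1-D threshold
# classifiers, computed by brute force over a finite grid of thresholds. This stands
# in for the ERM-with-flipped-labels computation mentioned in the abstract.
import numpy as np

rng = np.random.default_rng(2)

n = 200
x = rng.uniform(0, 1, n)
y = ((x > 0.5).astype(int) + (rng.uniform(size=n) < 0.1)) % 2   # noisy threshold labels

half = n // 2
x1, y1, x2, y2 = x[:half], y[:half], x[half:], y[half:]

def err(threshold, xs, ys):
    return np.mean((xs > threshold).astype(int) != ys)

thresholds = np.linspace(0, 1, 201)   # the finite hypothesis "class"

# Maximal discrepancy: largest gap between the errors on the two halves of the data.
max_discrepancy = max(err(t, x1, y1) - err(t, x2, y2) for t in thresholds)

# The penalized criterion then adds this data-based quantity to the training error
# of the empirical risk minimizer (constants from the paper's bounds are omitted).
erm_t = min(thresholds, key=lambda t: err(t, x, y))
print("maximal discrepancy:", max_discrepancy)
print("penalized empirical risk:", err(erm_t, x, y) + max_discrepancy)
```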
Abstract:
The problem of obesity is alarming public health authorities around the world. Therefore, it is important to study its determinants. In this paper we explore the empirical relationship between household income and body mass index (BMI) in nine European Union countries. Our findings suggest that the association is negative for women, but we find no statistically significant relationship for men. However, we show that the different relationship for men and women appears to be driven by the negative relationship for women between BMI and individual income from work. We tentatively conclude that the negative relationship between household income and BMI for women may simply be capturing the wage penalty that obese women suffer in the labor market.
Abstract:
This state-of-the-question review presents the results of an analysis of the evolution and main characteristics of the geography journals included in the Journal Citation Reports, Social Science Citation Index edition, and which therefore have an impact factor. The study period chosen runs from 1997 to 2005, that is, the last nine years with available data. In total, thirty-nine journals appeared on the list, a good part of which remained on it throughout the period studied. Ten publications have ranked among the five with the highest impact factor in a given year, and none has held first place for more than two consecutive years. Eighteen different subject areas were identified across the set of journals, among which general-interest journals and those devoted to economic and regional geography stand out. The great majority of the volumes are published by commercial publishers, with Blackwell Publishing the most prominent. The journals are clearly Anglo-Saxon in origin; only two are written in another language. The second part of the article describes all the publications covered over the nine years studied, with a short review of each.
Abstract:
The Evaluation, Monitoring and Selection Service of the ISPC has produced a study on the personality profile of applicants to the Basic Training Course for police officers, which was presented at the International Society for the Study of Individual Differences Meeting held at CosmoCaixa in Barcelona and organized jointly by the Ibero-American Association for Research on Individual Differences and the Universitat de Barcelona. The study, entitled Revised NEO Personality Inventory Normative Data for Catalan police officer selection: A preliminary study, aims to compare the personality profiles of a sample of ISPC applicants with the results for a sample of police applicants in the U.S., published in a prestigious scientific journal last February. The results show that the Catalan applicants stand out for higher scores on the conscientiousness and agreeableness dimensions, which would indicate that these traits are particularly valued during the selection process of the Catalan police; on other personality characteristics the two samples obtain similar results. The characteristic profile of the Catalan police officer would be that of a person who is emotionally stable, not very impulsive, able to manage stress, people-oriented, agreeable, sociable, responsible, disciplined and cautious. Link to the International Society for the Study of Individual Differences Meeting: http://www.issid.org/conferences/ISSID2013/ISSIDconference2013.html
Abstract:
Diffuse flow velocimetry (DFV) is introduced as a new, noninvasive, optical technique for measuring the velocity of diffuse hydrothermal flow. The technique uses images of a motionless, random medium (e.g., rocks) obtained through the lens of a moving refraction index anomaly (e.g., a hot upwelling). The method works in two stages. First, the changes in apparent background deformation are calculated using particle image velocimetry (PIV). The deformation vectors are determined by a cross correlation of pixel intensities across consecutive images. Second, the 2-D velocity field is calculated by cross correlating the deformation vectors between consecutive PIV calculations. The accuracy of the method is tested with laboratory and numerical experiments of a laminar, axisymmetric plume in fluids with both constant and temperature-dependent viscosity. Results show that average RMS errors are ∼5%–7% and that the method is most accurate in regions of pervasive apparent background deformation, which is commonly encountered in regions of diffuse hydrothermal flow. The method is applied to a 25 s video sequence of diffuse flow from a small fracture captured during the Bathyluck’09 cruise to the Lucky Strike hydrothermal field (September 2009). The velocities of the ∼10°C–15°C effluent reach ∼5.5 cm/s, in strong agreement with previous measurements of diffuse flow. DFV is found to be most accurate for approximately 2-D flows where background objects have a small spatial scale, such as sand or gravel.
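The first stage of DFV is a standard PIV computation: the apparent background displacement between consecutive frames is taken as the peak of the cross-correlation of pixel intensities. The sketch below implements that single step for one interrogation window with an FFT-based correlation on a synthetic texture; it does not reproduce the second, vector-on-vector correlation stage or the error analysis reported above.

```python
# Hedged sketch of the PIV step in DFV: estimate the apparent integer-pixel shift
# between two interrogation windows by locating the peak of their FFT cross-correlation.
# The synthetic texture, window size, and shift are placeholders; the second stage
# (correlating deformation vectors between consecutive PIV results) is not shown.
import numpy as np

rng = np.random.default_rng(3)

def cross_correlation_shift(a, b):
    """Integer-pixel shift that best maps window a onto window b (circular)."""
    c = np.fft.ifft2(np.fft.fft2(b) * np.conj(np.fft.fft2(a))).real
    dy, dx = np.unravel_index(np.argmax(c), c.shape)
    if dy > a.shape[0] // 2:        # map large indices to negative shifts
        dy -= a.shape[0]
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return int(dy), int(dx)

# Synthetic test: a random "rock" texture shifted by a known amount.
window = rng.standard_normal((64, 64))
shifted = np.roll(window, (3, -5), axis=(0, 1))

print(cross_correlation_shift(window, shifted))   # expected: (3, -5)
```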
Abstract:
This article describes a methodology for assessing the impact of information on the Internet, using the indexing and retrieval capabilities of the Altavista search engine. The occasion is also used to describe the role of HTML meta elements as a mechanism for structuring and ordering information. The limitations and reliability of the method are discussed, and data are presented showing the production of WWW pages at the institutional and national level, together with a comparison with other European countries. Particular emphasis is placed on the possibility of measuring the impact of these pages in terms of the number of times they are 'linked' from external pages, in a manner similar to the way the 'Citation Index' of the Institute for Scientific Information works.