191 results for Market selection
Abstract:
In this paper, we use a unique long-run dataset of regulatory constraints on capital account openness to explain stock market correlations. Since stock returns themselves are highly volatile, any examination of what drives correlations needs to focus on long runs of data. This is particularly true since some of the short-term changes in co-movements appear to reverse themselves (Delroy Hunter 2005). We argue that changes in the co-movement of indices have not been random. Rather, they are mainly driven by greater freedom to move funds from one country to another. In related work, Geert Bekaert and Campbell Harvey (2000) show that equity correlations increase after liberalization of capital markets, using a number of case studies from emerging countries. We examine this pattern systematically for the last century, and find it to be most pronounced in the recent past. We compare the importance of capital account openness with one main alternative explanation, the growing synchronization of economic fundamentals. We conclude that greater openness has been the single most important cause of growing correlations during the last quarter of a century, though increasingly correlated economic fundamentals also matter. In the conclusion, we offer some thoughts on why the effects of greater openness appear to be so much stronger today than they were during the last era of globalization before 1914.
Abstract:
Excess entry, or the high failure rate of market-entry decisions, is often attributed to overconfidence exhibited by entrepreneurs. We show analytically that whereas excess entry is an inevitable consequence of imperfect assessments of entrepreneurial skill, it does not imply overconfidence. Judgmental fallibility leads to excess entry even when everyone is underconfident. Self-selection implies greater confidence (but not necessarily overconfidence) among those who start new businesses than those who do not, and among successful entrants than failures. Our results question claims that entrepreneurs are overconfident and emphasize the need to understand the role of judgmental fallibility in producing economic outcomes.
Abstract:
We argue that during the crystallization of common and civil law in the 19th century, the optimal degree of discretion in judicial rulemaking, albeit influenced by the comparative advantages of both legislative and judicial rulemaking, was mainly determined by the anti-market biases of the judiciary. The different degrees of judicial discretion adopted in both legal traditions were thus optimally adapted to different circumstances, mainly rooted in the unique, market-friendly, evolutionary transition enjoyed by English common law as opposed to the revolutionary environment of the civil law. On the Continent, constraining judicial discretion was essential for enforcing freedom of contract and establishing a market economy. The ongoing debasement of pro-market fundamentals in both branches of the Western legal system is explained from this perspective as a consequence of increased perceptions of exogenous risks and changes in the political system, which favored the adoption of sharing solutions and removed the cognitive advantage of parliaments and political leaders.
Abstract:
In this paper, we present a matching model with adverse selection that explains why flows into and out of unemployment are much lower in Europe compared to North America, while employment-to-employment flows are similar in the two continents. In the model, firms use discretion in terms of whom to fire and, thus, low-quality workers are more likely to be dismissed than high-quality workers. Moreover, as hiring and firing costs increase, firms find it more costly to hire a bad worker and, thus, they prefer to hire out of the pool of employed job seekers rather than out of the pool of the unemployed, who are more likely to turn out to be 'lemons'. We use microdata for Spain and the U.S. and find that the ratio of the job finding probability of the unemployed to the job finding probability of employed job seekers was smaller in Spain than in the U.S. Furthermore, using U.S. data, we find that discrimination against the unemployed increased over the 1980s in those states that raised firing costs by introducing exceptions to the employment-at-will doctrine.
Abstract:
This paper applies the theoretical literature on nonparametric bounds on treatment effects to the estimation of how limited English proficiency (LEP) affects wages and employment opportunities for Hispanic workers in the United States. I analyze the identifying power of several weak assumptions on treatment response and selection, and stress the interactions between LEP and education, occupation and immigration status. I show that the combination of two weak but credible assumptions provides informative upper bounds on the returns to language skills for certain subgroups of the population. Adding age at arrival as a monotone instrumental variable also provides informative lower bounds.
Abstract:
This paper argues that the strategic use of debt favours the revelation of information in dynamic adverse selection problems. Our argument is based on the idea that debt is a credible commitment to end long-term relationships. Consequently, debt encourages a privately informed party to disclose its information at early stages of a relationship. We illustrate our point with the financing decision of a monopolist selling a good to a buyer whose valuation is private information. A high level of (renegotiable) debt, by increasing the scope for liquidation, may induce the high-valuation buyer to buy early at a high price and thus increase the monopolist's expected payoff. By affecting the buyer's strategy, it may reduce the probability of excessive liquidation. We investigate the consequences of good durability and we examine the way debt may alleviate the ratchet effect.
Abstract:
We analyse credit market equilibrium when banks screen loan applicants. When banks have a convex cost function of screening, a pure-strategy equilibrium exists in which banks optimally set interest rates at the same level as their competitors. This result complements Broecker's (1990) analysis, where he demonstrates that no pure-strategy equilibrium exists when banks have zero screening costs. In our setup we show that interest rates on loans are largely independent of marginal costs, a feature consistent with the extant empirical evidence. In equilibrium, banks make positive profits in our model in spite of the threat of entry by inactive banks. Moreover, an increase in the number of active banks increases credit risk and so does not improve credit market efficiency: this point has important regulatory implications. Finally, we extend our analysis to the case where banks have differing screening abilities.
Abstract:
That individuals contribute in social dilemma interactions even when contributing is costly is a well-established observation in the experimental literature. Since a contributor is always strictly worse off than a non-contributor, the question arises whether an intrinsic motivation to contribute can survive in an evolutionary setting. Using recent results on the deterministic approximation of stochastic evolutionary dynamics, we give conditions for equilibria with a positive number of contributors to be selected in the long run.
Abstract:
We analyze the political support for employment protection legislation. Unlike our previous work on the same topic, this paper pays close attention to the role of obsolescence in the growth process. In voting in favour of employment protection, incumbent employees trade off lower living standards (because employment protection maintains workers in less productive activities) against longer job duration. The support for employment protection will then depend on the value of the latter relative to the cost of the former. We highlight two key determinants of this trade-off: first, the workers' bargaining power; second, the economy's growth rate, more precisely its rate of creative destruction.
Abstract:
We perform an experiment on a pure coordination game with uncertainty about the payoffs. Our game is closely related to models that have been used in many macroeconomic and financial applications to solve problems of equilibrium indeterminacy. In our experiment each subject receives a noisy signal about the true payoffs. This game has a unique strategy profile that survives the iterated deletion of strictly dominated strategies (thus a unique Nash equilibrium). The equilibrium outcome coincides, on average, with the risk-dominant equilibrium outcome of the underlying coordination game. The behavior of the subjects converges to the theoretical prediction after enough experience has been gained. The data (and the comments) suggest that subjects do not apply the iterated deletion of dominated strategies through "a priori" reasoning. Instead, they adapt to the responses of other players. Thus, the length of the learning phase clearly varies for the different signals. We also test behavior in a game without uncertainty as a benchmark case. The game with uncertainty is inspired by the "global" games of Carlsson and Van Damme (1993).
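The solution concept this abstract relies on, iterated deletion of strictly dominated pure strategies, can be sketched as a generic routine for finite two-player games. The payoff matrices below are hypothetical stand-ins for illustration, not the experimental game used in the paper:

```python
def iterated_deletion(A, B):
    """Iteratively delete strictly dominated pure strategies.

    A[i][j]: row player's payoff; B[i][j]: column player's payoff.
    Only pure-strategy domination is checked in this sketch.
    Returns the surviving row and column strategy indices.
    """
    rows = list(range(len(A)))
    cols = list(range(len(A[0])))
    changed = True
    while changed:
        changed = False
        # Remove any row strictly dominated by another surviving row.
        for i in list(rows):
            if any(all(A[k][j] > A[i][j] for j in cols) for k in rows if k != i):
                rows.remove(i)
                changed = True
        # Remove any column strictly dominated by another surviving column.
        for j in list(cols):
            if any(all(B[i][k] > B[i][j] for i in rows) for k in cols if k != j):
                cols.remove(j)
                changed = True
    return rows, cols

# A dominance-solvable 2x2 game with hypothetical payoffs:
A = [[3, 1], [2, 0]]
B = [[3, 2], [1, 0]]
print(iterated_deletion(A, B))  # only (row 0, column 0) survives
```

When, as in the game described above, a single strategy profile survives this procedure, it is the unique Nash equilibrium.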
Abstract:
It has long been standard in agency theory to search for incentive-compatible mechanisms on the assumption that people care only about their own material wealth. However, this assumption is clearly refuted by numerous experiments, and we feel that it may be useful to consider nonpecuniary utility in mechanism design and contract theory. Accordingly, we devise an experiment to explore optimal contracts in an adverse-selection context. A principal proposes one of three contract menus, each of which offers a choice of two incentive-compatible contracts, to two agents whose types are unknown to the principal. The agents know the set of possible menus, and choose to either accept one of the two contracts offered in the proposed menu or to reject the menu altogether; a rejection by either agent leads to lower (and equal) reservation payoffs for all parties. While all three possible menus favor the principal, they do so to varying degrees. We observe numerous rejections of the more lopsided menus, and approach an equilibrium where one of the more equitable contract menus (which one depends on the reservation payoffs) is proposed and agents accept a contract, selecting actions according to their types. Behavior is largely consistent with all recent models of social preferences, strongly suggesting there is value in considering nonpecuniary utility in agency theory.
Abstract:
This paper deals with the dynamics of the Catalan textile labour market (Catalonia being the Spanish region that concentrated most of the industrial and factory activity during the 19th century) and offers hypotheses and results on the impact it had on living standards and fertility levels. We observe the formation of an uneven labour market in which the male supply of labour (excluding women and children) grew much faster than the demand. We stress the fact that labour supply was very dependent on institutional factors linked to the transmission of household property between generations. In contrast, the slow growth of demand for adult male labour reflects the limits of this industry's capacity to expand and to compete in international markets. The strategy we document here, by which working-class families adapted to scarce employment opportunities, was a reduction in legitimate fertility levels. Fertility control, we argue, was the direct instrument workers had to limit their numbers in a situation likely to create labour surpluses in the short and mid run.
Abstract:
We consider the dynamic relationship between product market entry regulation and equilibrium unemployment. The main theoretical contribution is combining a job matching model with monopolistic competition in the goods market and individual wage bargaining. Product market competition affects unemployment through two channels: the output expansion effect and a countervailing effect due to a hiring externality. Competition is then linked to barriers to entry. We calibrate the model to US data and perform a policy experiment to assess whether the decrease in trend unemployment during the 1980s and 1990s could be attributed to product market deregulation. Our quantitative analysis suggests that under individual bargaining, a decrease of less than two tenths of a percentage point in unemployment rates can be attributed to product market deregulation, a surprisingly small amount.
Abstract:
We study financial markets in which both rational and overconfident agents coexist and make endogenous information acquisition decisions. We demonstrate the following irrelevance result: when a positive fraction of rational agents (endogenously) decides to become informed in equilibrium, prices are set as if all investors were rational, and as a consequence the overconfidence bias does not affect informational efficiency, price volatility, rational traders' expected profits or their welfare. Intuitively, as overconfidence goes up, so does price informativeness, which makes rational agents cut their information acquisition activities, effectively undoing the standard effect of more aggressive trading by the overconfident.
Abstract:
Given $n$ independent replicates of a jointly distributed pair $(X,Y)\in {\cal R}^d \times {\cal R}$, we wish to select from a fixed sequence of model classes ${\cal F}_1, {\cal F}_2, \ldots$ a deterministic prediction rule $f: {\cal R}^d \to {\cal R}$ whose risk is small. We investigate the possibility of empirically assessing the {\em complexity} of each model class, that is, the actual difficulty of the estimation problem within each class. The estimated complexities are in turn used to define an adaptive model selection procedure, which is based on complexity-penalized empirical risk. The available data are divided into two parts. The first is used to form an empirical cover of each model class, and the second is used to select a candidate rule from each cover based on empirical risk. The covering radii are determined empirically to optimize a tight upper bound on the estimation error. An estimate is chosen from the list of candidates in order to minimize the sum of class complexity and empirical risk. A distinguishing feature of the approach is that the complexity of each model class is assessed empirically, based on the size of its empirical cover. Finite sample performance bounds are established for the estimates, and these bounds are applied to several non-parametric estimation problems. The estimates are shown to achieve a favorable tradeoff between approximation and estimation error, and to perform as well as if the distribution-dependent complexities of the model classes were known beforehand. In addition, it is shown that the estimate can be consistent, and even possess near-optimal rates of convergence, when each model class has an infinite VC or pseudo dimension. For regression estimation with squared loss we modify our estimate to achieve a faster rate of convergence.
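The split-sample, complexity-penalized selection scheme described above can be sketched minimally. This toy version uses two stand-in model classes (constant and simple linear least-squares predictors) and fixed illustrative penalty constants in place of the paper's empirically estimated, cover-based complexities; the function names and penalties are assumptions for illustration only:

```python
import random

def fit_constant(xs, ys):
    # Best constant predictor under squared loss is the sample mean.
    m = sum(ys) / len(ys)
    return lambda x: m

def fit_linear(xs, ys):
    # Ordinary least squares for a one-dimensional predictor.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    a = my - b * mx
    return lambda x: a + b * x

def empirical_risk(f, xs, ys):
    return sum((f(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def select_model(xs, ys, fitters, penalties):
    # Split the sample: the first half fits one candidate per class,
    # the second half estimates each candidate's risk. The winner
    # minimizes empirical risk plus the class's complexity penalty.
    h = len(xs) // 2
    best = None
    for fit, pen in zip(fitters, penalties):
        f = fit(xs[:h], ys[:h])
        score = empirical_risk(f, xs[h:], ys[h:]) + pen
        if best is None or score < best[0]:
            best = (score, f)
    return best[1]

random.seed(0)
xs = [i / 50 for i in range(100)]
ys = [2 * x + 1 + random.gauss(0, 0.1) for x in xs]
f = select_model(xs, ys, [fit_constant, fit_linear], [0.01, 0.02])
# On this linear-plus-noise data the linear class should win despite
# its larger (illustrative) penalty.
```

The paper's actual procedure replaces the fixed penalties with data-driven complexities computed from the size of each class's empirical cover, but the structure of the selection step is the same.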