833 results for Optimal timing
Abstract:
This work aims to study and evaluate techniques for accelerating functional timing analysis (FTA) algorithms based on automatic test pattern generation (ATPG). To that end, three well-known algorithms are covered: the D-algorithm, PODEM, and FAN. After analyzing these algorithms and studying some acceleration techniques, the DETA algorithm (Delay Enumeration-Based Timing Analysis) is proposed, which determines the critical delay of circuits containing complex gates. DETA is defined as an ATPG-based algorithm with concurrent path sensitization. In implementing the algorithm, it was possible to validate the delay computation model for circuits containing complex gates using the implicit macro-expansion approach. Moreover, partial results show that, for some circuits, DETA exhibits only a weak dependence on the number of inputs when compared with the dependence observed in the simulation procedure. It is thus possible to avoid an extensive search before a test is found and, in this way, to apply acceleration methods to the algorithm successfully.
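For contrast with the functional approach described above, the purely structural (topological) critical delay can be sketched in a few lines of Python. This is a minimal illustration, not the DETA algorithm: the netlist, gate names, and delay values are invented, and FTA/ATPG methods tighten this bound by discarding paths that no input vector can sensitize, a step omitted here.

```python
# Structural longest-path delay for a toy gate-level netlist.
# ATPG-based FTA (e.g. DETA) improves on this bound by checking
# path sensitizability; this sketch takes every path at face value.
def structural_critical_delay(gates, primary_inputs):
    """gates: net -> (gate delay, list of fanin nets)."""
    arrival = {net: 0.0 for net in primary_inputs}

    def arrival_time(net):
        if net not in arrival:
            delay, fanins = gates[net]
            arrival[net] = delay + max(arrival_time(f) for f in fanins)
        return arrival[net]

    return max(arrival_time(net) for net in gates)

# Hypothetical two-level circuit; names and delays are illustrative only.
gates = {
    "n1":  (2.0, ["a", "b"]),    # AND gate, 2.0 ns
    "n2":  (1.5, ["b", "c"]),    # OR gate, 1.5 ns
    "out": (1.0, ["n1", "n2"]),  # complex gate, 1.0 ns
}
print(structural_critical_delay(gates, ["a", "b", "c"]))  # 3.0
```

The structural bound here (3.0 ns through n1) is an upper bound on the true functional delay; when the longest path is false, the functional critical delay is strictly smaller.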
Abstract:
We characterize the optimal auction in an independent private values framework for a completely general distribution of valuations. We do this by introducing a new concept: the generalized virtual valuation. To show the wider applicability of this concept, we present two examples showing how to extend the classical models of Mussa and Rosen and of Baron and Myerson to arbitrary distributions.
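For context, the classical (Myerson) virtual valuation that the abstract's generalized concept extends applies when the valuation distribution $F$ admits a density $f$:

$$\psi(v) \;=\; v \;-\; \frac{1 - F(v)}{f(v)},$$

and the revenue-maximizing auction awards the good to the bidder with the highest nonnegative virtual valuation. The generalization discussed above is needed precisely when $F$ is arbitrary and this formula is not well defined.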
Abstract:
We develop a theory of public versus private ownership based on value diversion by managers. Government is assumed to face stronger institutional constraints than has been assumed in the previous literature. The model which emerges from these assumptions is flexible and has wide application. We provide a mapping between the qualitative characteristics of an asset, its main use - including public goods characteristics and spillovers to other assets' values - and the optimal ownership and management regime. The model is applied to single and multiple related assets. We address questions such as: when is it optimal to have one of a pair of related assets public and the other private; when is joint management desirable; and when should a public asset be managed by the owner of a related private asset? We show that while private ownership can be judged optimal in some cases solely on the basis of qualitative information, the optimality of any other ownership and management regime relies on quantitative analysis. Our results reveal the situations in which policy makers will have difficulty in determining the optimal regime.
Abstract:
This paper considers the general problem of Feasible Generalized Least Squares Instrumental Variables (FGLS IV) estimation using optimal instruments. First we summarize the sufficient conditions for the FGLS IV estimator to be asymptotically equivalent to an optimal GLS IV estimator. Then we specialize to stationary dynamic systems with stationary VAR errors, and use the sufficient conditions to derive new moment conditions for these models. These moment conditions produce useful IVs from the lagged endogenous variables, despite the correlation between errors and endogenous variables. This use of the information contained in the lagged endogenous variables expands the class of IV estimators under consideration and thereby potentially improves both the asymptotic and small-sample efficiency of the optimal IV estimator in the class. Some Monte Carlo experiments compare the new methods with those of Hatanaka [1976]. For the DGP used in the Monte Carlo experiments, asymptotic efficiency is strictly improved by the new IVs, and experimental small-sample efficiency is improved as well.
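As a reminder of the standard setup (generic textbook notation, not the paper's specific derivation), IV estimation rests on moment conditions of the form $\mathbb{E}[z_t(y_t - x_t'\beta_0)] = 0$, and the efficient estimator in a given class weights the sample moments by the inverse of their long-run variance:

$$\bar g_T(\beta) = \frac{1}{T}\sum_{t=1}^{T} z_t\,(y_t - x_t'\beta), \qquad \hat\beta = \arg\min_\beta\; \bar g_T(\beta)'\,\hat S^{-1}\,\bar g_T(\beta),$$

where $z_t$ collects the instruments (here augmented with lagged endogenous variables) and $\hat S$ is a consistent estimate of the variance of the moments. Enlarging the instrument set, as the new moment conditions above do, can only weakly lower the asymptotic variance of the resulting estimator.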
Abstract:
The purpose of this work is to provide a brief overview of the literature on the optimal design of unemployment insurance systems, by analyzing some of the most influential articles published on the subject over the last three decades, and to extend the main results to an environment with multiple aggregate shocks. The properties of optimal contracts are discussed in light of the key assumptions commonly made in theoretical publications in the area, and the implications of relaxing each of these hypotheses are considered as well. The analysis of models with a single unemployment spell starts from the seminal work of Shavell and Weiss (1979). In a simple and common setting, unemployment benefit policies, wage taxes, and search effort assignments are covered. Further, the idea that the UI distortion of the relative price of leisure and consumption is the only explanation for the marginal incentives to search for a job is discussed, calling into question the view that the reduction in labor supply caused by social insurance is solely evidence of dynamic moral hazard driven by a substitution effect. In addition, the paper presents a characterization of optimal unemployment insurance contracts in environments in which workers experience multiple unemployment spells. Finally, an extension to an environment with multiple aggregate shocks is considered. The paper ends with a numerical analysis of the implications of i.i.d. shocks for the optimal unemployment insurance mechanism.
Abstract:
This work analyzes the relationship between a regulator and an oil company. Several uncertainties are inherent to this relationship, and the work focuses on the effects of information asymmetry. We characterize optimal regulation under asymmetric information, where the regulator must design a mechanism that induces the firm to truthfully reveal its private information. In the case where the firm cannot commit not to break the agreement, we show that the regulator may fail to implement the optimal outcome obtained under complete information. In that case, the regulator cannot share risks with the firm optimally. Finally, we present an example showing that the Spence-Mirrlees condition (SMC) may fail to hold. This result arises naturally in the model.
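For reference, one common statement of the Spence-Mirrlees (single-crossing) condition, in generic screening notation rather than the paper's own, is that the agent's marginal rate of substitution between the allocation $q$ and the transfer $t$ be monotone in the type $\theta$:

$$\frac{\partial}{\partial \theta}\left(\frac{\partial u/\partial q}{\partial u/\partial t}\right) \;>\; 0 \quad \text{for all } (q, t, \theta),$$

where $u(q, t, \theta)$ is the agent's utility. When this condition holds, types can be ordered and incentive compatibility reduces to local constraints; the example mentioned above shows a setting where this convenient structure breaks down.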
Abstract:
This paper considers tests which maximize the weighted average power (WAP). The focus is on determining WAP tests subject to an uncountable number of equalities and/or inequalities. The unifying theory allows us to obtain tests with correct size, similar tests, and unbiased tests, among others. A WAP test may be randomized and its characterization is not always possible. We show how to approximate the power of the optimal test by sequences of nonrandomized tests. Two alternative approximations are considered. The first approach considers a sequence of similar tests for an increasing number of boundary conditions. This discretization allows us to implement the WAP tests in practice. The second method finds a sequence of tests which approximate the WAP test uniformly. This approximation allows us to show that WAP similar tests are admissible. The theoretical framework is readily applicable to several econometric models, including the important class of the curved-exponential family. In this paper, we consider the instrumental variable model with heteroskedastic and autocorrelated errors (HAC-IV) and the nearly integrated regressor model. In both models, we find WAP similar and (locally) unbiased tests which dominate other available tests.
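In its textbook form (with generic notation and weights, not the paper's specific choices), the WAP program maximizes power averaged over the alternative subject to a size constraint at every point of the null:

$$\max_{\varphi}\;\int_{\Theta_1} \mathbb{E}_\theta\!\left[\varphi(Y)\right] d\Lambda(\theta) \quad \text{s.t.} \quad \mathbb{E}_\theta\!\left[\varphi(Y)\right] \le \alpha \;\; \forall\, \theta \in \Theta_0,$$

where $\varphi$ is a (possibly randomized) test and $\Lambda$ is a weight over alternatives $\Theta_1$. The size constraint over a continuum of null parameters $\Theta_0$ is the uncountable family of inequalities the abstract refers to, and it is this family that the discretization approach truncates to finitely many boundary conditions.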
Abstract:
I show that when a central bank is financially independent from the treasury and has balance sheet concerns, an increase in the size or a change in the composition of the central bank's balance sheet (quantitative easing) can serve as a commitment device in a liquidity trap scenario. In particular, when the short-term interest rate is up against the zero lower bound, an open market operation by the central bank that involves purchases of long-term bonds can help mitigate the deflation and a large negative output gap under a discretionary equilibrium. This is because such an open market operation provides an incentive to the central bank to keep interest rates low in the future in order to avoid losses in its balance sheet.
Abstract:
This paper aims at contributing to the research agenda on the sources of price stickiness, showing that the adoption of nominal price rigidity may be an optimal reaction by firms to consumers' behavior, even if firms face no adjustment costs. Under regular, broadly accepted assumptions on the behavior of economic agents, we show that competition among firms can lead to the adoption of sticky prices as a (sub-game perfect) equilibrium strategy. We introduce the concept of a consumption-centers model economy, in which there are several complete markets. Moreover, we weaken some traditional assumptions used in standard monetary policy models by assuming that households have imperfect information about the inefficient time-varying cost shocks faced by firms, e.g. those regarding inefficient equilibrium output levels under flexible prices. The timing of events is assumed to be such that, in every period, consumers have access to the actual prices prevailing in the market only after choosing a particular consumption center. Since such choices under uncertainty may decrease the expected utilities of risk-averse consumers, competitive firms adopt some degree of price stickiness in order to reduce price uncertainty and "attract more customers".
Abstract:
This paper presents optimal rules for monetary policy in Brazil derived from a backward looking expectation model consisting of a Keynesian IS function and an Augmented Phillips Curve (ISAS). The IS function displays a high sensitivity of aggregate demand to the real interest rate, and the Phillips curve is accelerationist. The optimal monetary rules show low interest rate volatility, with reaction coefficients lower than those suggested by Taylor (1993a,b). Reaction functions estimated through ADL and SUR models suggest that monetary policy has not been optimal and has aimed at output rather than inflation stabilization.
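For reference, the Taylor (1993) benchmark that reaction coefficients are typically compared against can be written as a one-line rule. The coefficient values (0.5 on both gaps, 2% equilibrium real rate and inflation target) are Taylor's originals; the function and variable names are ours.

```python
# Taylor (1993) benchmark rule: i = r* + pi + 0.5*(pi - pi*) + 0.5*y_gap
# All quantities in annualized percentage points.
def taylor_rule(pi, output_gap, r_star=2.0, pi_target=2.0,
                a_pi=0.5, a_y=0.5):
    return r_star + pi + a_pi * (pi - pi_target) + a_y * output_gap

# Example: inflation at 4%, output 1% above potential.
print(taylor_rule(4.0, 1.0))  # 7.5
```

An estimated reaction function with coefficients below these benchmark values, as the abstract reports, implies a policy rate that responds less aggressively to inflation and output deviations than the Taylor prescription.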
Abstract:
This paper examines the output losses caused by disinflation and the role of credibility in a model where pricing rules are optimal and individual prices are rigid. Individual nominal rigidity is modeled as resulting from menu costs. The interaction between optimal pricing rules and credibility is essential in determining inflationary inertia. A continued period of high inflation generates an asymmetric distribution of price deviations, with more prices that are substantially lower than their desired levels than prices that are substantially higher than the optimal ones. When disinflation is not credible, inflationary inertia is engendered by this asymmetry: idiosyncratic shocks trigger more upward than downward adjustments. A perfectly credible disinflation causes an immediate change of pricing rules which, by rendering the price deviation distribution less asymmetric, practically annihilates inflationary inertia. An implication of our model is that stabilization may be successful even when credibility is low, provided that it is preceded by a mechanism of price alignment. We also develop an analytical framework for analyzing imperfect credibility cases.
Abstract:
This paper examines the relevance of market timing as a motive for initial public offerings (IPOs) by comparing IPOs of firms that are members of Japanese keiretsu industrial groups with IPOs of independent Japanese firms. We argue that Japanese keiretsu-linked IPOs form a favorable sample to find evidence of the market timing motive. Instead, the data provide strong evidence for a restructuring motive and little evidence for market timing. We find that long run returns to keiretsu and independent IPOs are not negative, contrary to U.S. evidence, and are indistinguishable from each other; initial returns to keiretsu-linked IPOs are significantly higher than to independent firms; and a significant number of keiretsu IPO firms adjust their linkages with the group following the IPO, with both increases and decreases.
Abstract:
Implementation and collapse of exchange rate pegging schemes are recurrent events. A currency crisis (pegging) is usually followed by an economic downturn (boom). This essay explains why a benevolent government should pursue fiscal and monetary policies that lead to those recurrent currency crises and subsequent periods of pegging. It is shown that the optimal policy induces a competitive equilibrium that displays a boom in periods of below average devaluation and a recession in periods of above average devaluation. A currency crisis (pegging) can be understood as an optimal policy answer to a recession (boom).
Abstract:
We characterize optimal policy in a two-sector growth model with fixed coefficients and with no discounting. The model is a specialization to a single type of machine of a general vintage capital model originally formulated by Robinson, Solow and Srinivasan; its simplicity is not mirrored in its rich dynamics, which seem to have been missed in earlier work. Our results are obtained by viewing the model as a specific instance of the general theory of resource allocation as initiated originally by Ramsey and von Neumann and brought to completion by McKenzie. In addition to the more recent literature on chaotic dynamics, we relate our results to the older literature on optimal growth with one state variable: specifically, to the one-sector setting of Ramsey, Cass and Koopmans, as well as to the two-sector setting of Srinivasan and Uzawa. The analysis is purely geometric, and from a methodological point of view, our work can be seen as an argument, at least in part, for the rehabilitation of geometric methods as an engine of analysis.