927 results for Endogenous Information Structure
Abstract:
Royal palm tree peroxidase (RPTP) is a highly stable enzyme with respect to acidity, temperature, H₂O₂, and organic solvents. Thus, RPTP is a promising candidate for developing H₂O₂-sensitive biosensors for diverse applications in industry and analytical chemistry. RPTP belongs to the family of class III secretory plant peroxidases, which includes horseradish peroxidase isozyme C and the soybean and peanut peroxidases. Here we report the X-ray structure of native RPTP isolated from royal palm tree (Roystonea regia), refined to a resolution of 1.85 Å. RPTP has the same overall folding pattern as the rest of the plant peroxidase superfamily, and it contains one heme group and two calcium-binding sites in similar locations. The three-dimensional structure of RPTP was solved in a hydroperoxide complex state, and it revealed a bound 2-(N-morpholino)ethanesulfonic acid (MES) molecule positioned at a putative secondary substrate-binding site. Nine N-glycosylation sites are clearly defined in the RPTP electron-density maps, revealing for the first time the conformations of the glycan chains of this highly glycosylated enzyme. Furthermore, statistical coupling analysis (SCA) of the plant peroxidase superfamily was performed. This sequence-based method identified a set of evolutionarily conserved sites that map to regions surrounding the heme prosthetic group. The SCA matrix also predicted a set of energetically coupled residues involved in maintaining the structural fold of plant peroxidases. The combination of crystallographic data and SCA analysis provides information about the key structural elements that could help explain the unique stability of RPTP.
Abstract:
The Raman band assigned to the ν(C=O) mode in N,N-dimethylformamide (at ca. 1660 cm⁻¹) was used as a probe to study a group of 1-alkyl-3-methylimidazolium bromide ionic liquids ([C(n)MIm]Br) with different alkyl groups (n = 2, 4, 6, 8 and 10 carbons) in equimolar binary mixtures with dimethylformamide. Owing to the high electric dipole moment of the C=O group, there is substantial coupling between adjacent molecules in the solution, and the corresponding Raman band involves both vibrational and reorientational modes. Different chain lengths of the ILs lead to different extents of uncoupling of adjacent dimethylformamide molecules, resulting in different shifts of this band in the mixtures. Information about the organization of the ionic liquids in solution was obtained, and a model of aggregation for these systems is proposed.
Abstract:
The Short-term Water Information and Forecasting Tools (SWIFT) is a suite of tools for flood and short-term streamflow forecasting, consisting of a collection of hydrologic model components and utilities. Catchments are modeled using conceptual subareas and a node-link structure for channel routing. The tools comprise modules for calibration, model state updating, output error correction, ensemble runs and data assimilation. Given the combinatorial nature of the modelling experiments and the sub-daily time steps typically used for simulations, the volume of model configurations and time series data is substantial and its management is not trivial. SWIFT is currently used mostly for research purposes but has also been used operationally, with intersecting but significantly different requirements. Early versions of SWIFT used mostly ad-hoc text files handled via Fortran code, with limited use of netCDF for time series data. The configuration and data handling modules have since been redesigned. The model configuration now follows a design where the data model is decoupled from the on-disk persistence mechanism. For research purposes the preferred on-disk format is JSON, to leverage numerous software libraries in a variety of languages, while retaining the legacy option of custom tab-separated text formats when it is a preferred access arrangement for the researcher. By decoupling data model and data persistence, it is much easier to interchangeably use for instance relational databases to provide stricter provenance and audit trail capabilities in an operational flood forecasting context. For the time series data, given the volume and required throughput, text based formats are usually inadequate. A schema derived from CF conventions has been designed to efficiently handle time series for SWIFT.
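The decoupling of the configuration data model from its on-disk persistence described above can be sketched as follows. This is a minimal illustration only; the class and method names are assumptions for exposition, not the actual SWIFT API, and the legacy tab-separated format shown handles only flat key-value configurations:

```python
import json
from abc import ABC, abstractmethod

class ConfigStore(ABC):
    """Persistence interface: the in-memory data model (a plain dict here)
    is independent of how it is written to disk."""
    @abstractmethod
    def save(self, config: dict, path: str) -> None: ...
    @abstractmethod
    def load(self, path: str) -> dict: ...

class JsonStore(ConfigStore):
    """Preferred research format: readable by libraries in many languages."""
    def save(self, config, path):
        with open(path, "w") as f:
            json.dump(config, f, indent=2)
    def load(self, path):
        with open(path) as f:
            return json.load(f)

class TsvStore(ConfigStore):
    """Legacy-style flat key<TAB>value text format; scalar values only."""
    def save(self, config, path):
        with open(path, "w") as f:
            for key, value in config.items():
                f.write(f"{key}\t{value}\n")
    def load(self, path):
        config = {}
        with open(path) as f:
            for line in f:
                key, value = line.rstrip("\n").split("\t", 1)
                config[key] = value
        return config
```

Because calling code depends only on the `ConfigStore` interface, swapping in, say, a relational-database-backed store for operational provenance requirements does not touch the data model.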
Abstract:
Standard models of moral hazard predict a negative relationship between risk and incentives, but the empirical work has not confirmed this prediction. In this paper, we propose a model with adverse selection followed by moral hazard, where effort and the degree of risk aversion are private information of an agent who can control the mean and the variance of profits. For a given contract, more risk-averse agents supply more effort in risk reduction. If the marginal utility of incentives decreases with risk aversion, more risk-averse agents prefer lower-incentive contracts; thus, in the optimal contract, incentives are positively correlated with endogenous risk. In contrast, if risk aversion is high enough, the possibility of reduction in risk makes the marginal utility of incentives increasing in risk aversion and, in this case, risk and incentives are negatively related.
Abstract:
The initial endogenous growth models emphasized the importance of external effects in explaining sustainable growth across time. Empirically, this hypothesis can be confirmed if the coefficient of physical capital per hour is unity in the aggregate production function. Although cross-section results concur with theory, previous estimates using time series data rejected this hypothesis, showing a small coefficient far from unity. It seems that the problem lies not with the theory but with the techniques employed, which are unable to capture low frequency movements in high frequency data. This paper uses cointegration, a technique designed to capture the existence of long-run relationships in multivariate time series, to test the externalities hypothesis of endogenous growth. The results confirm the theory and conform to previous cross-section estimates. We show that there is long-run proportionality between output per hour and a measure of capital per hour. Using this result, we confirm the hypothesis that the implied Solow residual can be explained by government expenditures on infrastructure, which suggests a supply-side role for government affecting productivity and a decrease in the extent to which the Solow residual explains the variation of output.
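The long-run proportionality claim (a unit coefficient on capital per hour) can be illustrated with a toy two-step check on synthetic data: generate two series sharing a common stochastic trend, regress log output per hour on log capital per hour, and inspect the slope. This is a sketch under stated assumptions, not the paper's actual cointegration procedure:

```python
import random

random.seed(42)
n = 200
# Common stochastic trend shared by both series (the source of cointegration).
trend = 0.0
k, y = [], []
for _ in range(n):
    trend += random.gauss(0.005, 0.01)
    k.append(trend + random.gauss(0, 0.02))        # log capital per hour
    y.append(trend + 0.1 + random.gauss(0, 0.02))  # log output per hour

# OLS slope of y on k: cov(k, y) / var(k); proportionality predicts slope ~ 1.
mk = sum(k) / n
my = sum(y) / n
slope = sum((ki - mk) * (yi - my) for ki, yi in zip(k, y)) / \
        sum((ki - mk) ** 2 for ki in k)
print(round(slope, 2))
```

Because the shared trend dominates the idiosyncratic noise, the estimated slope lands near unity, which is what a cointegrating regression exploits at low frequencies even when short-run movements are noisy.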
Abstract:
This paper analyses the equilibrium structure of protection in Mercosul, developing empirical analyses based on the literature ensuing from the sequence of models set forth by Grossman and Helpman since 1994. Not only may Mercosul's common external tariff (CET) be explained under a political economy perspective, but the existence of deviations, both at the level of the external tariffs and at that of the internal ones, makes it interesting to contrast several structures under this approach. Different general equilibrium frameworks, in which governments are concerned with campaign contributions and with the welfare of the average voter, while organized special-interest groups care only about the welfare of their members, are used as the theoretical basis of the empirical tests. We build a single equation for explaining the CET and two four-equation systems (one equation for each member) for explaining deviations from the CET and from internal free trade between members. The results (at the two-digit level) shed an interesting light on the sectoral dynamics of protection in each country; notably, Brazil seems to fit the model framework best, followed by Uruguay. In the case of the CET, and of deviations from it, the interaction between the domestic lobbies in the four countries plays a major role. There is also a suggestion that the lobby structure that bids for deviations, be they internal or external, differs from the one that bids for the CET.
Abstract:
We examine the problem of a buyer who wishes to purchase and combine n objects owned by n individual owners to realize a higher value. The owners are able to delay their entry into the sale process: they can either sell now or sell later. Among other assumptions, the simple assumptions of competition (that the presence of more owners at the point of sale reduces their surplus) and discounting lead to interesting results: there is costly delay in equilibrium. Moreover, with sufficiently strong competition, the probability of delay increases with n. Thus, buyers who discount the future will face increased costs as the number of owners increases. The source of transaction costs is the owners' desire to dis-coordinate in the presence of competition. These costs are unrelated to transaction costs currently identified in the literature, specifically those due to asymmetric information, or public goods problems where players impose negative externalities on each other by under-contributing.
Abstract:
In infinite horizon financial markets economies, competitive equilibria fail to exist if one does not impose restrictions on agents' trades that rule out Ponzi schemes. When there is limited commitment and collateral repossession is the unique default punishment, Araujo, Páscoa and Torres-Martínez (2002) proved that Ponzi schemes are ruled out without imposing any exogenous/endogenous debt constraints on agents' trades. Recently Páscoa and Seghir (2009) have shown that this positive result is not robust to the presence of additional default punishments. They provide several examples showing that, in the absence of debt constraints, harsh default penalties may induce agents to run Ponzi schemes that jeopardize equilibrium existence. The objective of this paper is to close a theoretical gap in the literature by identifying endogenous borrowing constraints that rule out Ponzi schemes and ensure existence of equilibria in a model with limited commitment and (possible) default. We appropriately modify the definition of finitely effective debt constraints, introduced by Levine and Zame (1996) (see also Levine and Zame (2002)), to encompass models with limited commitment, default penalties and collateral. Along this line, we introduce in the setting of Araujo, Páscoa and Torres-Martínez (2002), Kubler and Schmedders (2003) and Páscoa and Seghir (2009) the concept of actions with finite equivalent payoffs. We show that, independently of the level of default penalties, restricting plans to have finite equivalent payoffs rules out Ponzi schemes and guarantees the existence of an equilibrium that is compatible with the minimal ability to borrow and lend that we expect in our model. An interesting feature of our debt constraints is that they give rise to budget sets that coincide with the standard budget sets of economies having a collateral structure but no penalties (as defined in Araujo, Páscoa and Torres-Martínez (2002)). 
This illustrates the hidden relation between finitely effective debt constraints and collateral requirements.
Abstract:
This paper develops a methodology for testing the term structure of volatility forecasts derived from stochastic volatility models, and implements it to analyze models of S&P500 index volatility. Using measurements of the ability of volatility models to hedge and value term-structure-dependent option positions, we find that hedging tests support the Black-Scholes delta and gamma hedges, but not the simple vega hedge when there is no model of the term structure of volatility. With various models, it is difficult to improve on a simple gamma hedge assuming constant volatility. Of the volatility models, the GARCH components estimate of the term structure is preferred. Valuation tests indicate that all the models contain term structure information not incorporated in market prices.
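The Black-Scholes delta and gamma hedges referenced in the hedging tests reduce, for a European call under constant volatility, to two closed-form sensitivities. The sketch below computes them; the parameter values are purely illustrative and are not taken from the paper:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_pdf(x):
    """Standard normal density."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def bs_delta_gamma(spot, strike, rate, sigma, tau):
    """Delta and gamma of a European call in the Black-Scholes model.

    delta = N(d1), gamma = n(d1) / (S * sigma * sqrt(tau)).
    """
    d1 = (math.log(spot / strike) + (rate + 0.5 * sigma ** 2) * tau) \
         / (sigma * math.sqrt(tau))
    delta = norm_cdf(d1)
    gamma = norm_pdf(d1) / (spot * sigma * math.sqrt(tau))
    return delta, gamma

# Illustrative at-the-money call: S = K = 100, r = 2%, sigma = 20%, 6 months.
delta, gamma = bs_delta_gamma(spot=100.0, strike=100.0,
                              rate=0.02, sigma=0.2, tau=0.5)
print(round(delta, 3), round(gamma, 4))
```

A delta hedge neutralizes first-order exposure to the underlying, while adding a gamma hedge also neutralizes the second-order exposure; the paper's point is that these two, even under constant volatility, are hard to beat without a term-structure model for the vega hedge.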
Abstract:
There is strong empirical evidence that risk premia in long-term interest rates are time-varying. These risk premia critically depend on interest rate volatility, yet existing research has not examined the impact of time-varying volatility on excess returns for long-term bonds. To address this issue, we incorporate interest rate option prices, which are very sensitive to interest rate volatility, into a dynamic model for the term structure of interest rates. We estimate three-factor affine term structure models using both swap rates and interest rate cap prices. When we incorporate option prices, the model better captures interest rate volatility and is better able to predict excess returns for long-term swaps over short-term swaps, both in- and out-of-sample. Our results indicate that interest rate options contain valuable information about risk premia and interest rate dynamics that cannot be extracted from interest rates alone.
Abstract:
This paper evaluates how information asymmetry affects the strength of competition in credit markets. A theory is presented in which adverse selection softens competition by decreasing the incentives creditors have to compete in the interest rate dimension. In equilibrium, although creditors compete, the outcome is similar to collusion. Three empirical implications arise. First, interest rates should respond asymmetrically to changes in the cost of funds: increases in the cost of funds should, on average, have a larger effect on interest rates than decreases. Second, aggressiveness in pricing should be associated with a worsening in bank-level default rates. Third, bank-level default rates should be endogenous. We then verify the validity of these three empirical implications using Brazilian data on consumer overdraft loans. The results in this paper rationalize seemingly abnormally high interest rates on unsecured loans.
Abstract:
We study N-bidder, asymmetric all-pay auctions under incomplete information. First, we solve for the equilibrium of a parametric model. Each bidder's valuation is independently drawn from a uniform [0, αi] distribution, where the parameter αi may vary across bidders. In this game, asymmetries are exogenously given. Next, a two-stage game where asymmetries are endogenously generated is studied. At the first stage, each bidder chooses the level of an observable, costly, value-enhancing action. The second stage is the bidding sub-game, whose equilibrium is simply the equilibrium of the previously analyzed game with exogenous asymmetries. Finally, natural applications of the all-pay auction in the context of political lobbying are considered: the effects of excluding bidders, as well as the impact of caps on bids.
Abstract:
Incomplete markets and non-default borrowing constraints increase the volatility of pricing kernels and are helpful when addressing asset-pricing puzzles. However, ruling out default when markets are incomplete is suboptimal. This paper endogenizes borrowing constraints as an intertemporal incentive structure to default. It models an infinite-horizon economy, where agents are allowed not to pay their liabilities and face borrowing constraints that depend on the individual history of default. Those constraints trade off the economy's risk-sharing possibilities and incentives to prevent default. The equilibrium presents stationary properties, such as an invariant distribution for the assets' solvency rate.
Abstract:
We analyze the stability of monetary regimes in a decentralized economy where fiat money is endogenously created, information about its value is imperfect, and agents only learn from their personal trading experiences. We show that in poorly informed economies, monetary stability depends heavily on the government's commitment to the long-run value of money, whereas in economies where agents gather information more easily, monetary stability can be an endogenous outcome. We generate dynamics in the acceptability of fiat money that resemble historical accounts of the rise and eventual collapse of overissued paper money. Moreover, our results provide an explanation of the fact that, despite its obvious advantages, the widespread use of fiat money is a very recent development.
Abstract:
In the modern Knowledge Economy, in the era of Big Data, correctly understanding the use and management of Information and Communication Technology (ICT), grounded in the academic field of Information Systems (IS) studies, becomes increasingly relevant and strategic for organizations that intend to remain in business, to be able to meet new demands (internal and external), and to face the complex changes of market competition. This research draws on stages-of-growth theory, founded on Richard L. Nolan's studies in the 1970s. The academic literature on stages-of-growth models and the context of the IS field provide the conceptual basis of this study. The research identifies a model, with constructs related to the growth stages of organizational ICT/IS initiatives, starting from Nolan's second-level benchmark variables, and proposes its operationalization through the creation and development of a scale. Exploratory and descriptive in character, the research makes a theoretical contribution to the stages-of-growth paradigm by adding a new growth process to its conceptual framework. As a result, it delivers a bilingual (Portuguese and English) scale instrument, together with recommendations and rules for applying a survey-type research instrument in the continuation of this study. As a general implication, it is expected that using this instrument to measure the ICT/IS stage level in organizations can help two audiences: academics who study this topic, as well as practitioners seeking answers to their practical actions in the organizations where they work.