991 results for Costly state verification
Abstract:
This paper studies the efficiency of equilibria in a productive OLG economy where the process of financial intermediation is characterized by costly state verification. Both competitive equilibria and Constrained Pareto Optimal allocations are characterized. It is shown that market outcomes can be socially inefficient, even when a weaker notion than Pareto optimality is considered.
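As background on the framework this abstract relies on, here is a minimal sketch of a Townsend-style costly state verification contract; the notation is assumed for illustration and is not taken from the paper.

```latex
% Sketch of a standard costly state verification (CSV) debt contract (assumed notation).
% The borrower's realized return \omega \sim F is private; the lender can observe it
% only by paying a verification cost \mu. The optimal contract is standard debt:
% verify exactly when \omega falls below the contracted repayment \bar{\omega}.
\[
  \Pi(\bar{\omega})
  = \int_{0}^{\bar{\omega}} (\omega - \mu)\,\mathrm{d}F(\omega)
  + \bar{\omega}\,\bigl(1 - F(\bar{\omega})\bigr),
\]
% where \Pi is the lender's expected payoff per unit lent, and \bar{\omega} is set so
% that \Pi(\bar{\omega}) matches the lender's opportunity cost of funds.
```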
Abstract:
We show how the prospect of disputes over firms’ revenue reports promotes debt financing over equity. These findings are presented within a costly state verification model with a risk-averse entrepreneur. The prospect of disputes encourages incentive regimes that limit penalties and avoid stochastic monitoring, even when the lender can commit to stochastic enforcement strategies. Consequently, optimal contracts shift away from equity and toward standard debt. For a useful special case of the model, closed-form solutions are presented for leverage and consumption allocations under efficient debt contracts.
Abstract:
This thesis examines the effects of financial market imperfections on the macroeconomy. More specifically, it studies the consequences of bankruptcy in financial contracts from a dynamic general equilibrium perspective. The first paper builds a model that uses banks' comparative advantage in handling financial distress to explain firms' choice between bank loans and market financing. The model succeeds in explaining why smaller firms prefer bank financing and why bank loans are more prevalent in Europe. The first fact is explained by the negative link between a firm's net worth and its probability of going bankrupt; the second by the higher fixed cost of issuing bonds in Europe. The second paper examines the interaction between the financing constraints facing households and firms. A positive interaction could amplify and prolong the effect of an aggregate shock on the economy. I build a new model featuring external finance premia for both firms and households. In the baseline model with flexible prices and wages, I obtain a weak negative interaction between firms' and households' financing costs. The key factor behind this result is the effect of the countercyclical change in households' financing costs on their labor supply and loan demand: in an expansion, this effect raises interest rates, reduces investment, and increases firms' financing costs. The third paper adds bank financing constraints to a macroeconomic model with mortgage loans and fluctuations in house prices. Banks in the model cannot fully diversify their loans, which creates a link between household and bank default risk. Two opposing business-cycle effects act on the bank's external finance premium. First, the positive link between bank and borrower default risk tends to make banks' external financing cost countercyclical. Second, consumption smoothing by households makes banks' share of external financing procyclical, which tends to make bank financing costs procyclical. Combining these two effects, the model can reproduce procyclical bank profits and bank leverage ratios, as in the data, but for non-financial shocks the bank financing frictions in the model have no quantitatively significant effect on the main aggregate variables such as consumption or investment.
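As a hedged illustration of the external finance premium mechanism underlying the first two papers, the standard financial-accelerator relation is sketched below; the notation is generic and not taken from the thesis.

```latex
% External finance premium increasing in leverage (generic, assumed notation):
% E[R^k] = expected return on capital, R = risk-free rate,
% QK = value of assets financed, N = borrower net worth.
\[
  \frac{\mathbb{E}\!\left[R^{k}\right]}{R} = s\!\left(\frac{QK}{N}\right),
  \qquad s'(\cdot) > 0, \quad s(1) = 1,
\]
% so a fall in net worth N raises leverage and, with it, the cost of external funds.
```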
Abstract:
The goal of this work is to use some stylized facts of the "Great Recession", specifically the drastic fall in bank capitalization, to analyze the relationship between financial cycles and real cycles, as well as the effectiveness of unconventional monetary policy and macroprudential policies. To this end, the first chapter develops a microfoundation of banking based on a Costly State Verification model, which is subsequently embedded in different specifications of DSGE models. The results show that: (i) financial cycles and business cycles can be linked through the deterioration of bank capital; and (ii) macroprudential and unconventional policies are effective in moderating business cycles, but are costly in terms of resources and inflation.
Abstract:
The occurrence of fatigue has been extensively researched in steel and other metallic materials; it is, however, not as broadly understood in concrete. This produces a lack of uniformity in the approach and process used to verify concrete structures for the ultimate limit state of fatigue. As more research is conducted and more becomes known about the parameters that cause, propagate, and indirectly affect fatigue in concrete, this knowledge is being incorporated into design guides around the world. Nevertheless, the ultimate limit state verification is not addressed consistently by the various design governing bodies. This report presents a baseline understanding of what the phenomenon of fatigue is, what causes it, and what loading or material conditions amplify or reduce the likelihood of fatigue failure. Four design codes are presented, and their verification processes are examined, compared, and evaluated both qualitatively and quantitatively. Using a wind turbine tower structure as a case study, the report presents calculated results following the verification processes prescribed in the respective reference codes.
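For context on what a fatigue ultimate-limit-state check typically involves, a generic damage-accumulation rule of the kind used alongside S-N (Wöhler) curves is sketched below; this is general background, not a statement of any particular code compared in the report.

```latex
% Palmgren-Miner linear damage accumulation (generic form, not tied to a specific code):
% n_i = number of applied cycles at stress level i,
% N_i = cycles to failure at that level, taken from the relevant S-N curve.
% The verification requires the cumulative damage D not to exceed unity:
\[
  D = \sum_{i} \frac{n_i}{N_i} \le 1 .
\]
```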
Abstract:
The current means and methods of verifying that high-strength bolts have been properly tightened are laborious and time-consuming. In some cases, the techniques require special equipment and, in other cases, the verification itself can be somewhat subjective. While some commercially available verification techniques exist, these options still have limitations and can be costly. The main objectives of this project were to explore high-strength bolt-tightening and verification techniques and to investigate the feasibility of developing and implementing new alternatives. A literature search and a survey of state departments of transportation (DOTs) were conducted to collect information on various bolt-tightening techniques and to build an understanding of available and under-development techniques. During the literature review, the requirements for materials, inspection, and installation methods outlined in the Research Council on Structural Connections specification were also reviewed and summarized. To guide the search for new alternatives and technology development, a working group meeting was held at the Iowa State University Institute for Transportation on October 12, 2015. During the meeting, topics central to the research were discussed with Iowa DOT engineers and other professionals with relevant experience.
Abstract:
National Highway Traffic Safety Administration, Washington, D.C.
Abstract:
Research in verification and validation (V&V) for concurrent programs can be guided by practitioner information. The survey presented in this paper therefore collected state-of-practice information on V&V technology for concurrency from 35 respondents. The results can help refine existing V&V technology by providing a better understanding of the context in which it is used. Responses to questions about the motivation for selecting V&V technologies can also help refine a systematic approach to V&V technology selection.
Abstract:
This paper presents a new methodology to estimate unbalanced harmonic distortions in a power system based on measurements at a limited number of sites. The algorithm uses evolutionary strategies (ES), a branch of evolutionary algorithms. The proposed problem-solving algorithm makes use of data from various power quality meters, which can be synchronized either by high-technology GPS devices or by using information from a fundamental-frequency load flow, which makes the overall power quality monitoring system much less costly. The ES-based harmonic estimation model is applied to a 14-bus network to compare its performance with a conventional Monte Carlo approach. It is also applied to a 50-bus subtransmission network to compare the three-phase and single-phase approaches as well as to assess the robustness of the proposed method.
Abstract:
This paper presents a new methodology to estimate harmonic distortions in a power system based on measurements at a limited number of sites. The algorithm uses evolutionary strategies (ES), a branch of evolutionary algorithms. The main advantage of such a technique lies in its modeling flexibility as well as its ability to solve fairly complex problems. The proposed problem-solving algorithm makes use of data from various power-quality (PQ) meters, which can be synchronized either by high-technology global positioning system devices or by using information from a fundamental-frequency load flow. This second approach makes the overall PQ monitoring system much less costly. The algorithm is applied to an IEEE test network, for which a sensitivity analysis is performed to determine how the ES parameters can be selected so that the algorithm performs effectively. Case studies show promising results and demonstrate the robustness of the proposed method.
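A minimal sketch of the kind of evolutionary-strategy search these two abstracts describe is given below. The linear network model, meter placement, and fitness definition are illustrative assumptions, not the authors' implementation: candidate harmonic-injection vectors are mutated and selected so that the harmonic voltages they imply at the metered buses match the measurements.

```python
# Illustrative (mu + lambda) evolutionary strategy for harmonic state estimation.
# Assumption (not from the paper): a fixed linear relation V = H @ I between harmonic
# injection currents I and bus harmonic voltages V, with PQ meters on a subset of buses.
import numpy as np

rng = np.random.default_rng(0)

n_buses, n_meters = 14, 5
H = rng.normal(size=(n_buses, n_buses))           # stand-in for the harmonic transfer matrix
metered = rng.choice(n_buses, size=n_meters, replace=False)
true_inj = rng.normal(size=n_buses)               # "unknown" harmonic injections
measured = (H @ true_inj)[metered]                # harmonic voltages seen by the PQ meters

def fitness(inj):
    """Mismatch between measured and estimated harmonic voltages at the metered buses."""
    return float(np.sum(((H @ inj)[metered] - measured) ** 2))

mu, lam, sigma = 10, 40, 0.5                      # parents, offspring, mutation step size
pop = rng.normal(size=(mu, n_buses))

for _ in range(200):
    parents = pop[rng.integers(0, mu, size=lam)]                   # uniform parent selection
    offspring = parents + sigma * rng.normal(size=(lam, n_buses))  # Gaussian mutation
    combined = np.vstack([pop, offspring])                         # (mu + lambda) selection
    combined = combined[np.argsort([fitness(x) for x in combined])]
    pop = combined[:mu]
    sigma *= 0.99                                 # simple deterministic step-size decay

# Reports the residual mismatch of the best candidate found; with only 5 meters
# for 14 unknowns the estimate is one injection profile consistent with the data.
print("residual mismatch at metered buses:", fitness(pop[0]))
```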
Abstract:
Modern Integrated Circuit (IC) design is characterized by a strong trend toward Intellectual Property (IP) core integration into complex system-on-chip (SOC) architectures. These cores require thorough verification of their functionality to avoid erroneous behavior in the final device. Formal verification methods are capable of detecting any design bug, but due to state explosion their use remains limited to small circuits. Alternatively, simulation-based verification can explore hardware descriptions of any size, although the corresponding stimulus generation, as well as the functional coverage definition, must be carefully planned to guarantee its efficacy. In general, static input-space optimization methodologies have shown better efficiency and results than, for instance, Coverage Directed Verification (CDV) techniques, although the two act on different facets of the monitored system and are not mutually exclusive. This work presents a constrained-random simulation-based functional verification methodology in which, on the basis of the Parameter Domains (PD) formalism, irrelevant and invalid test case scenarios are removed from the input space. To this end, a tool was developed to automatically generate PD-based stimuli sources, and a second tool was developed to generate functional coverage models that fit the PD-based input space exactly. Together, the input stimuli and coverage model enhancements yielded a notable increase in testbench efficiency compared to testbenches with traditional stimulation and coverage scenarios: a 22% simulation time reduction when generating stimuli with the PD-based stimuli sources (still with a conventional coverage model), and a 56% simulation time reduction when combining the stimuli sources with their corresponding, automatically generated, coverage models.
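A minimal sketch of constrained-random stimulus generation over explicit parameter domains, with a coverage model built from the same domains, is shown below in the spirit of the abstract. The fields, domains, and legality constraint are invented for illustration; the paper's tools target hardware testbench components rather than Python.

```python
# Constrained-random stimulus generation over explicit parameter domains,
# with a functional coverage model derived from the same domains (illustrative only).
import itertools
import random

random.seed(1)

# Parameter domains: only these values are legal, so invalid and irrelevant
# scenarios are excluded from the input space up front.
DOMAINS = {
    "length": [64, 128, 256, 512],
    "mode":   ["read", "write"],
    "burst":  [1, 4, 8],
}

def legal(stim):
    """Constraint: an illustrative rule removing an irrelevant corner of the space."""
    return not (stim["mode"] == "read" and stim["burst"] == 8)

# Coverage model: one bin per legal combination of the parameter domains,
# so the coverage space matches the constrained input space exactly.
coverage = {combo: 0
            for combo in itertools.product(*DOMAINS.values())
            if legal(dict(zip(DOMAINS, combo)))}

def random_stimulus():
    """Draw random stimuli until one satisfies the domain constraint."""
    while True:
        stim = {k: random.choice(v) for k, v in DOMAINS.items()}
        if legal(stim):
            return stim

for _ in range(200):                      # drive 200 constrained-random transactions
    s = random_stimulus()
    coverage[tuple(s[k] for k in DOMAINS)] += 1

hit = sum(1 for count in coverage.values() if count > 0)
print(f"functional coverage: {hit}/{len(coverage)} bins hit")
```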
Abstract:
Using spontaneous parametric down-conversion, we produce polarization-entangled states of two photons and characterize them using two-photon tomography to measure the density matrix. A controllable decoherence is imposed on the states by passing the photons through thick, adjustable birefringent elements. When the system is subject to collective decoherence, one particular entangled state is seen to be decoherence-free, as predicted by theory. Such decoherence-free systems may have an important role for the future of quantum computation and information processing.
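For reference, the decoherence-free behavior described here is the textbook property of the two-photon polarization singlet; this is general background, not a claim about which specific state the experiment identified.

```latex
% The two-photon polarization singlet (standard notation):
\[
  |\psi^{-}\rangle = \tfrac{1}{\sqrt{2}}\bigl(|HV\rangle - |VH\rangle\bigr),
\]
% which is antisymmetric and therefore unchanged, up to a global phase, by any
% collective transformation U \otimes U such as a common birefringent phase
% applied identically to both photons.
```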