832 results for Guarantees
Abstract:
Highlights:
• Government intervention to stabilise financial systems in times of banking crises ultimately involves political decisions. This paper sheds light on how certain political variables influence policy choices during banking crises and hence have an impact on fiscal outlays.
• We employ cross-country econometric evidence from all crisis episodes in the period 1970–2011 to examine the impact political and party systems have on the fiscal cost of financial sector intervention.
• Governments in presidential systems are associated with lower fiscal costs of crisis management because they are less likely to use costly bank guarantees, thus reducing the exposure of the state to significant contingent and direct fiscal liabilities. Consistent with these findings, we find further evidence that these governments are less likely to use bank recapitalisation and more likely to impose losses on depositors.
Abstract:
One of the key challenges Ukraine is facing is the scale of its foreign debt (both public and private). As of 1 April 2015 it stood at US$ 126 billion, or 109.8% of the country's GDP. Approximately 45% of these financial obligations are short-term, meaning that they must be paid off within a year. Although the value of the debt has fallen by nearly US$ 10 billion since the end of 2014 (due to the private sector paying off part of its liabilities), the debt-to-GDP ratio has increased due to the recession and the depreciation of the hryvnia. Ukraine's foreign public debt (including state guarantees) is also on the rise; since the beginning of 2015 it has risen from US$ 37.6 billion to US$ 43.6 billion. Ukraine does not currently have the resources to pay off its debt. In this situation a debt restructuring is necessary, and it is one of the top priorities for the Ukrainian government as well as for the International Monetary Fund (IMF) and its assistance programme. Without it, overcoming the economic crisis will be much more difficult for Ukraine.
Abstract:
In its recent Schrems judgment, the Luxembourg Court annulled Commission Decision 2000/520, according to which US data protection rules were sufficient to satisfy EU privacy rules regarding EU-US transfers of personal data, otherwise known as the 'Safe Harbour' framework. What does this judgment mean, and what are its main implications for EU-US data transfers? In this paper the authors find that this landmark judgment sends a strong message to EU and US policy-makers about the need to ensure clear rules governing data transfers, so that people whose personal data are transferred to third countries have sufficient legal guarantees. Without such rules there is legal uncertainty and mistrust. Any future arrangement for the transatlantic transfer of data will therefore need to be firmly anchored in a framework of protection commensurate with the EU Charter of Fundamental Rights and the EU data protection architecture.
Abstract:
In the military dimension, the Four-Day War in Nagorno-Karabakh (2–5 April 2016) changed little in the conflict zone. It has, however, had a significant impact on the situation in Armenia. The country was shocked out of the political malaise that had been the dominant mood of recent years, and the Karabakh question, which had animated political life in the late 1980s and early 1990s, once again became a driving force behind developments. In the internal dimension, the renewed fighting galvanised the political scene, triggered a rise in nationalist sentiment, mobilised the public and consolidated it around the Karabakh question, overshadowing the frustrations caused by the country's difficult economic situation. In the external dimension, the war, which was viewed as Moscow-endorsed Azerbaijani aggression, undermined people's trust in Russia and the Armenian-Russian alliance. It also made it clear to Armenians how uncertain the Russian security guarantees were, and exacerbated their feelings of vulnerability and isolation on the international stage.
Abstract:
The research work presented in this thesis describes a new methodology for the automated, near real-time detection of pipe bursts in Water Distribution Systems (WDSs). The methodology analyses the pressure/flow data gathered by means of SCADA systems in order to extract useful information that goes beyond the usual monitoring activities and/or regulatory reporting, enabling a water company to proactively manage sections of its WDS. The work has an interdisciplinary nature, covering AI techniques and WDS management processes such as data collection, manipulation and analysis for event detection. Specifically, the methodology makes use of (i) an Artificial Neural Network (ANN) for the short-term forecasting of future pressure/flow signal values and (ii) a rule-based model for burst detection at the sensor and district level. The results of applying the new methodology to a District Metered Area in the Emilia-Romagna region, Italy, are also reported in the thesis. The results illustrate that the methodology is capable of detecting such failure events in a fast and reliable manner. The methodology enables water companies to save water, energy and money, and thereby helps them achieve higher levels of operational efficiency, compliance with current regulations and, last but not least, improved customer service.
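A minimal sketch of the two-stage idea described in this abstract: an ANN forecasts the next pressure/flow value from a sliding window of recent SCADA readings, and a simple rule raises a burst alarm when observations undershoot the forecast for several consecutive samples. The window size, thresholds, model architecture and the pressure-drop assumption are illustrative, not the thesis's actual configuration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

WINDOW = 12        # past samples used as ANN input (assumption)
TOL = 3.0          # allowed forecast error before a sample counts as anomalous
CONSECUTIVE = 4    # rule: this many anomalous samples in a row => burst alarm

def make_windows(signal):
    # Turn a 1-D signal into (sliding window -> next value) training pairs.
    X = np.array([signal[i:i + WINDOW] for i in range(len(signal) - WINDOW)])
    y = np.array(signal[WINDOW:])
    return X, y

def detect_bursts(history, live):
    """Train the ANN on burst-free history, then apply the rule to live data."""
    X, y = make_windows(history)
    ann = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000).fit(X, y)
    buf, run, alarms = list(history[-WINDOW:]), 0, []
    for t, observed in enumerate(live):
        forecast = ann.predict(np.array(buf).reshape(1, -1))[0]
        # Assumes a burst shows up as a pressure drop below the forecast.
        run = run + 1 if observed < forecast - TOL else 0
        if run >= CONSECUTIVE:
            alarms.append(t)
        buf = buf[1:] + [observed]
    return alarms
```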
Abstract:
The synthetic control (SC) method has recently been proposed as an alternative way to estimate treatment effects in comparative case studies. The SC relies on the assumption that there is a weighted average of the control units that reconstructs the potential outcome of the treated unit in the absence of treatment. If these weights were known, then one could estimate the counterfactual for the treated unit using this weighted average. With these weights, the SC would provide an unbiased estimator for the treatment effect even if selection into treatment is correlated with the unobserved heterogeneity. In this paper, we revisit the SC method in a linear factor model where the SC weights are considered nuisance parameters that are estimated in order to construct the SC estimator. We show that, when the number of control units is fixed, the estimated SC weights will generally not converge to the weights that reconstruct the factor loadings of the treated unit, even when the number of pre-intervention periods goes to infinity. As a consequence, the SC estimator will be asymptotically biased if treatment assignment is correlated with the unobserved heterogeneity. The asymptotic bias vanishes only when the variance of the idiosyncratic error goes to zero. We suggest a slight modification to the SC method that guarantees that the SC estimator is asymptotically unbiased and has a lower asymptotic variance than the difference-in-differences (DID) estimator when the DID identification assumption is satisfied. If the DID assumption is not satisfied, then both estimators would be asymptotically biased, and it would not be possible to rank them in terms of their asymptotic bias.
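For intuition, a minimal sketch of the standard SC weight-estimation step the abstract refers to: weights are constrained to be nonnegative and to sum to one, and are chosen so that the weighted average of the control units tracks the treated unit over the pre-intervention periods. Names are illustrative, and the paper's proposed modification of the method is not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize

def sc_weights(y_treated_pre, Y_controls_pre):
    """y_treated_pre: (T0,) pre-treatment outcomes of the treated unit.
    Y_controls_pre: (T0, J) pre-treatment outcomes of J control units."""
    J = Y_controls_pre.shape[1]
    loss = lambda w: np.sum((y_treated_pre - Y_controls_pre @ w) ** 2)
    res = minimize(loss, np.full(J, 1.0 / J),
                   bounds=[(0.0, 1.0)] * J,
                   constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}])
    return res.x

def sc_effect(y_treated_post, Y_controls_post, w):
    # Estimated effect: treated outcome minus its synthetic control, per period.
    return y_treated_post - Y_controls_post @ w
```

The asymptotic-bias result in the abstract concerns exactly these estimated weights: with a fixed number of control units they need not converge to weights that reconstruct the treated unit's factor loadings.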
Abstract:
This thesis analyses, between 2005 and 2013, the impact of government bailout policies on banking-sector risk in OECD countries. First, in line with the moral hazard hypothesis, financial institutions with a high bailout expectation are found to take on higher risks than the others. Second, it is found that, in normal times, rescue guarantees for large institutions distort competition in the sector and increase the risk taken by the remaining institutions. During the crisis, however, it is shown that increases in the bailout expectation of an institution's competitors, insofar as they represent a reduction in its own chance of an eventual government rescue, decrease its risk-taking. Additionally, during the crisis period it is also found that: reductions in countries' fiscal capacity are associated with lower risk-taking; and, on average, the increase in risk-taking is greater in countries with lower Credit Default Swap spreads.
Abstract:
We analyse the pricing determinants of Certificados de Recebíveis Imobiliários (CRIs) with respect to the underlying asset and levels of guarantees, controlling for size, maturity and rating. We find an additional average premium of 1.0 p.p. on CRIs when compared with debentures of similar maturity and the same rating. This premium is examined on two fronts: (a) although CRIs follow a relatively standardised format, we find that the security can represent different levels of risk and different underlying assets; and (b) this lack of standardisation leads to different pricing levels according to each issue's specific risk characteristics. The different risk levels are reflected in the variety of guarantees used: 41% of the issues carry personal guarantees from the originators (aval or surety). We conclude that there is, in general, a positive return difference (the average spread at issuance of inflation-indexed CRIs was 321 bps above the market yield curve), more pronounced in certain segments (a premium for the residential and land-subdivision segments) and mitigated by the level of guarantees offered. An average premium of 1.4 p.p. is observed for the residential and land-subdivision segments. Some characteristics of the issues were analysed as controls (size, maturity and, finally, the rating grades and the origin of the rating agency). Larger and longer-dated CRIs show lower spreads. As for rating, the effects on CRIs differ by segment. For residential CRIs, the effect is positive (a spread reduction) when the issue is rated by a rating agency, whereas for commercial CRIs the effect is negative. The effect can be positive for commercial CRIs (a spread reduction) when the issue is rated by an international rating agency or carries a rating grade above 'A'.
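As an illustration of the kind of pricing regression this abstract describes (spread at issuance regressed on guarantees and segment, controlling for size, maturity and rating), here is a hedged sketch; the dataset and column names are hypothetical and the study's exact specification is not reproduced.

```python
import pandas as pd
import statsmodels.formula.api as smf

cri = pd.read_csv("cri_issues.csv")  # hypothetical table of CRI issuances
model = smf.ols(
    "spread_bps ~ personal_guarantee + C(segment)"
    " + log_volume + maturity_years + has_rating + international_agency",
    data=cri,
).fit()
# Segment and guarantee coefficients would carry the premia discussed above.
print(model.summary())
```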
Abstract:
Vols.2-3 lack series note.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
A rapid increase in the number and size of protected areas has prompted interest in their effectiveness and calls for guarantees that they are providing a good return on investment by maintaining their values. The research reviewed here suggests that many remain under threat and that a significant number are already suffering deterioration. One suggestion for encouraging good management is to develop a protected-area certification system; however, this idea remains controversial and has created intense debate. We present a typology of options for guaranteeing good protected-area management, with examples, including: danger lists; self-reporting systems against individual or standardised criteria; and independent assessment, including standardised third-party reporting, use of existing certification systems such as those for forestry and farming, and certification tailored specifically to protected areas. We review the arguments for and against certification and identify some options, such as: development of an accreditation scheme to ensure that assessment systems meet minimum standards; building up experience from projects that are experimenting with certification in protected areas; and initiating certification schemes for specific users, such as private protected areas, or institutions like the World Heritage Convention.
Abstract:
Processor emulators are a software tool for allowing legacy computer programs to be executed on a modern processor. In the past, emulators were used in relatively trivial applications such as the maintenance of video games. Now, however, processor emulation is being applied to safety-critical control systems, including military avionics. These applications demand the utmost guarantees of correctness, but no verification techniques exist for proving that an emulated system preserves the original system's functional and timing properties. Here we show how this can be done by combining concepts previously used for reasoning about real-time program compilation with an understanding of the new and old software architectures. In particular, we show how both the old and new systems can be given a common semantics, thus allowing their behaviours to be compared directly.
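A minimal sketch of the "common semantics" idea from this abstract: model both the legacy system and its emulation as step functions over a shared abstract state that includes elapsed time, so functional and timing behaviour can be compared directly. The state fields, step functions and tolerance are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class State:
    regs: tuple      # register file contents
    mem: tuple       # memory contents
    time_us: float   # elapsed time, so timing properties are comparable too

def equivalent(step_old, step_new, init, n_steps, eps_us=1.0):
    """Run both semantics from the same initial state; require identical
    functional behaviour and timing that agrees within eps_us at each step."""
    s_old = s_new = init
    for _ in range(n_steps):
        s_old, s_new = step_old(s_old), step_new(s_new)
        if (s_old.regs, s_old.mem) != (s_new.regs, s_new.mem):
            return False          # functional behaviour diverged
        if abs(s_old.time_us - s_new.time_us) > eps_us:
            return False          # timing property violated
    return True
```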
Abstract:
Pattern discovery in a long temporal event sequence is of great importance in many application domains. Most previous work focuses on identifying positive associations among time-stamped event types. In this paper, we introduce the problem of defining and discovering negative associations which, like positive rules, may also serve as a source of knowledge discovery. In general, an event-oriented pattern is a pattern associated with a selected type of event, called a target event. As a counterpart to previous research, we identify patterns that have a negative relationship with the target events. A set of criteria is defined to evaluate the interestingness of patterns associated with such negative relationships. For counting the frequency of a pattern, we propose a new approach, called unique minimal occurrence, which guarantees that the Apriori property holds for all patterns in a long sequence. Based on the interestingness measures, algorithms are proposed to discover potentially interesting patterns for this negative rule problem. Finally, experiments on a real application are reported.
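For intuition, a sketch of the classical minimal-occurrence notion for a serial pattern, which counting approaches of this kind build on: a minimal occurrence is a time window containing the pattern such that no proper sub-window does. The paper's refined "unique minimal occurrence" definition is not reproduced here; names are illustrative.

```python
def minimal_occurrences(sequence, pattern):
    """sequence: list of (timestamp, event_type) sorted by timestamp;
    pattern: list of event types required to occur in this order."""
    if len(pattern) == 1:
        return [(t, t) for t, e in sequence if e == pattern[0]]
    candidates = []
    for i, (t_start, e) in enumerate(sequence):
        if e != pattern[0]:
            continue
        k = 1
        for t, ev in sequence[i + 1:]:
            if ev == pattern[k]:
                k += 1
                if k == len(pattern):
                    candidates.append((t_start, t))  # earliest completion
                    break
    # With earliest completions, a window fails to be minimal exactly when a
    # later start reaches the same end; keep the latest start per end time.
    latest_start = {}
    for start, end in candidates:
        latest_start[end] = max(latest_start.get(end, start), start)
    return sorted((s, e) for e, s in latest_start.items())

# Example: pattern A..B occurs minimally at [1, 2] and [4, 6].
print(minimal_occurrences([(1, "A"), (2, "B"), (4, "A"), (5, "C"), (6, "B")],
                          ["A", "B"]))
```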
Abstract:
The real-time refinement calculus is an extension of the standard refinement calculus in which programs are developed from a precondition plus postcondition style of specification. In addition to adapting standard refinement rules to be valid in the real-time context, specific rules are required for timing constructs such as delays and deadlines. Because many real-time programs may be nonterminating, a further extension is to allow nonterminating repetitions. A real-time specification constrains not only what values should be output, but when they should be output. Hence, for a program to implement such a specification, it must guarantee to output values by the specified times. With standard programming languages such guarantees cannot be made without taking into account the timing characteristics of the implementation of the program on a particular machine. To avoid having to consider such details during the refinement process, we have extended our real-time programming language with a deadline command. The deadline command takes no time to execute and always guarantees to meet the specified time; if the deadline has already passed, the deadline command is infeasible (miraculous in Dijkstra's terminology). When such a real-time program is compiled for a particular machine, one needs to ensure that all execution paths leading to a deadline are guaranteed to reach it by the specified time. We consider this checking to be part of an extended compilation phase. The addition of the deadline command restores for the real-time language the advantage of machine independence enjoyed by non-real-time programming languages.
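One plausible formalisation of the deadline command described above, reading it as an instantaneous coercion on a current-time variable τ (the notation is an assumption, not taken verbatim from this abstract): for a deadline D and postcondition P,

```latex
\mathbf{deadline}\; D \;\widehat{=}\; [\,\tau \le D\,]
\qquad
wp(\mathbf{deadline}\; D,\; P) \;=\; (\tau \le D \;\Rightarrow\; P)
```

So the command modifies no variables and takes no time; when τ > D it satisfies any postcondition, i.e. it is miraculous (infeasible), exactly as described above.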