920 results for New Keynesian model, Bayesian methods, Monetary policy, Great Inflation


Relevância:

100.00%

Resumo:

It has been almost fifty years since Harry Eckstein's classic monograph, A Theory of Stable Democracy (Princeton, 1961), in which he sketched out the basic tenets of “congruence theory”, which was to become one of the most important and innovative contributions to the understanding of democratic rule. His next work, Division and Cohesion in Democracy (Princeton University Press, 1966), was designed to serve as a plausibility probe for this 'theory' (ftn.) and is a case study of a Northern democratic system, Norway. Moreover, this line of his work best exemplifies the contribution Eckstein made to the methodology of comparative politics through his seminal article, “Case Study and Theory in Political Science” (in Greenstein and Polsby, eds., Handbook of Political Science, 1975), on the importance of the case study as an approach to empirical theory. That article demonstrates the special utility of “crucial case studies” in testing theory, thereby undermining the accepted wisdom in comparative research that the larger the number of cases, the better. Shifting the case-study unit of research, I intend to take up this challenge and build upon an equally unique political system, the Swedish one. Bearing in mind the peculiarities of the Swedish political system, my unit of analysis is further restricted to the Swedish Social Democratic Party, the Svenska Arbetare Partiet. My research nevertheless stays within the methodological framework of case-study theory inasmuch as it focuses on a single political system and party. The Swedish SAP's endurance in government office and its electoral success throughout half a century (ftn. As of the 1991 election, there had been about 56 years - more than half a century - of uninterrupted social democratic "reign" in Sweden.) are undeniably a performance no other Social Democratic party has yet achieved under democratic conditions.
Therefore, it is legitimate to inquire into the exceptionality of this unique political power combination. What were the components of this dominant power position that made SAP's endurance in governmental office possible? I will argue here that it was the end product of a combination of factors: a key position in the party system, strong party leadership and organization, and a carefully designed strategy regarding class politics and welfare policy. My research is divided into three main parts: the historical account, the 'welfare' part and the 'environment' part. The first part is a historical account of the main political events and issues relevant to my case study. Chapter 2 is devoted to the historical events unfolding in the 1920-1960 period: SAP's inception, the series of workers' strikes in the 1920s and the Saltsjoebaden Agreement. It describes SAP's ascent to power in the mid-1930s and the party's ensuing strategies for winning and keeping political office, that is, its economic program and key economic goals. The following chapter - chapter 3 - explores the period from the 1960s to the 1990s and covers the party's troubled political times, its peak and the beginnings of its decline. The 1960s are relevant for SAP's planning of a long-term economic strategy - the Rehn-Meidner model, a new way of macroeconomic steering based on the Keynesian model but adapted to the new economic realities of welfare capitalist societies. The second and third parts of this study develop several hypotheses related to SAP's 'dominant position' (endurance in politics and in office) and then test them. Mainly, the twin issues of economics and the environment are raised and their political relevance for the party analyzed.
On the one hand, globalization and its spillover effects on the Swedish welfare system are important causal factors in explaining the transformative socio-economic challenges the party had to contend with. On the other hand, Europeanization and environmental change greatly influenced SAP's foreign policy choices and its domestic electoral strategies. The implications of globalization for the Swedish welfare system are the subject of chapters four and five, while the consequences of Europeanization are treated at length in the third part of this work, chapters six and seven. At first sight, the link between foreign policy and electoral strategy appears difficult to prove, uncanny at the least. In the SAP's case, however, there is a substantial body of literature and public opinion data showing that governmental domestic policy and party politics depend tightly on foreign policy decisions and sovereignty issues. Again, these country characteristics and peculiar causal relationships are outlined in the first chapters and explained in the second and third parts. The sixth chapter explores the presupposed relationship between Europeanization and environmental policy, on the one hand, and SAP's environmental policy formulation and simultaneous agenda-setting at the international level, on the other. This chapter describes Swedish leadership in environmental policy formulation on two simultaneous fronts and across two different time spans. The last chapter, chapter eight, develops a conclusion and explores the alternative theories plausible in explaining the outlined hypotheses, pointing out why these theories do not hold as valid alternative explanations to my systemic corporatism thesis as the main causal factor behind SAP's 'dominant position'. Among the alternative theories, I consider L. Traedgaardh and Bo Rothstein's historical exceptionalism thesis and the public opinion thesis, which alone cannot explain the half-century social democratic endurance in government in the Swedish case.

Relevância:

100.00%

Resumo:

Small-scale dynamic stochastic general equilibrium (DSGE) models have been treated as the benchmark of much of the monetary policy literature, given their ability to explain the impact of monetary policy on output, inflation and financial markets. The empirical failure of New Keynesian models is partially due to the Rational Expectations (RE) paradigm, which entails a tight structure on the dynamics of the system. Under this hypothesis, the agents are assumed to know the data generating process. In this paper, we propose the econometric analysis of New Keynesian DSGE models under an alternative expectations-generating paradigm, which can be regarded as an intermediate position between rational expectations and learning, namely an adapted version of the "Quasi-Rational" Expectations (QRE) hypothesis. Given the agents' statistical model, we build a pseudo-structural form from the baseline system of Euler equations, imposing that the length of the reduced form is the same as in the 'best' statistical model.

Relevância:

100.00%

Resumo:

This paper shows that optimal policy and consistent policy outcomes require the use of control-theory and game-theory solution techniques, respectively. While optimal policy and consistent policy often produce different outcomes even in a one-period model, we analyze consistent policy and its outcome in a simple model, finding that the cause of the inconsistency with optimal policy traces to inconsistent targets in the social loss function. As a result, the central bank should adopt a loss function that differs from the social loss function. Carefully designing the central bank's loss function with consistent targets can harmonize optimal and consistent policy. This desirable result emerges from two observations. First, the social loss function reflects a normative process that is not necessarily consistent with the structure of the microeconomy. Thus, the social loss function cannot serve directly as the loss function for the central bank. Second, an optimal loss function for the central bank must depend on the structure of that microeconomy. In addition, this paper shows that control theory provides a benchmark for institution design in a game-theoretical framework.

Relevância:

100.00%

Resumo:

We develop a two-sector economy where each sector is classified as classical/Keynesian (contract/noncontract) in the labor market and traded/nontraded in the product market. We consider the effects of changes in monetary and exchange rate policy on sectoral and aggregate prices and outputs for different sectoral characterizations. Duca (1987) shows that nominal wage rigidity facilitates the effectiveness of monetary policy even in the classical sector. We demonstrate that trade price rigidity provides a similar path for the effectiveness of monetary policy, in this case, even when both sectors are classical.

Relevância:

100.00%

Resumo:

Quantitative real-time polymerase chain reaction (qPCR) is a sensitive gene quantitation method that has been widely used in the biological and biomedical fields. The currently used methods for qPCR data analysis, including the threshold cycle (CT) method and linear and non-linear model-fitting methods, all require subtracting background fluorescence. However, the removal of background fluorescence is usually inaccurate and can therefore distort results. Here, we propose a new method, the taking-difference linear regression method, to overcome this limitation. Briefly, for each pair of consecutive PCR cycles, we subtracted the fluorescence in the former cycle from that in the latter cycle, transforming the n-cycle raw data into (n-1)-cycle data. Linear regression was then applied to the natural logarithm of the transformed data, and amplification efficiencies and initial DNA molecule numbers were calculated for each PCR run. To evaluate this new method, we compared it, in terms of accuracy and precision, with the original linear regression method under three background corrections: the mean of cycles 1-3, the mean of cycles 3-7, and the minimum. Three criteria - threshold identification, max R2, and max slope - were employed to search for target data points. Considering that PCR data are time series data, we also applied linear mixed models. Collectively, when the threshold identification criterion was applied and when the linear mixed model was adopted, the taking-difference linear regression method was superior, giving an accurate estimation of the initial DNA amount and a reasonable estimation of PCR amplification efficiencies. When the criteria of max R2 and max slope were used, the original linear regression method gave an accurate estimation of the initial DNA amount. Overall, the taking-difference linear regression method avoids the error of subtracting an unknown background and is thus theoretically more accurate and reliable. This method is easy to perform, and the taking-difference strategy can be extended to all current methods for qPCR data analysis.
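The taking-difference step described in the abstract can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: under the usual exponential model F_i = B + F0·E^i, differencing consecutive cycles cancels the constant background B, leaving D_i = F0·(E-1)·E^i, so ln(D_i) is linear in the cycle index. The fitting window (`start`, `end`) selecting the exponential phase is an assumption here.

```python
import math

def taking_difference_qpcr(fluorescence, start, end):
    """Estimate amplification efficiency E and initial signal F0 from raw
    qPCR fluorescence readings, without subtracting any background.

    Model: F_i = B + F0 * E**i  (B = unknown constant background).
    Differencing removes B:  D_i = F_{i+1} - F_i = F0*(E-1)*E**i,
    so ln(D_i) is linear in i with slope ln(E).
    `start`/`end` select the exponential-phase cycle indices to fit
    (a hypothetical choice; the paper searches for these points).
    """
    diffs = [fluorescence[i + 1] - fluorescence[i]
             for i in range(len(fluorescence) - 1)]
    xs = list(range(start, end))
    ys = [math.log(diffs[i]) for i in xs]
    # Ordinary least squares on (i, ln D_i).
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    eff = math.exp(slope)                   # amplification efficiency E
    f0 = math.exp(intercept) / (eff - 1.0)  # initial fluorescence ~ initial DNA
    return eff, f0
```

On noiseless synthetic data generated from the model above, the estimator recovers E and F0 exactly, regardless of the background level B.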

Relevância:

100.00%

Resumo:

This paper examines whether the IMF's high interest rate policy was suitable for the crisis-ridden East Asian economies. Using an "overshooting" model similar to that of Dornbusch (1976), it shows that this sort of policy might cause an unnecessary deflationary adjustment process while having no effect on containing the real depreciation of exchange rates in the long run. The study also demonstrates that Thai economic data coincide quite well with the model presented here. Finally, it points out that the high interest rate policy itself might provoke a high risk premium, the existence of which, in turn, justifies the policy; that is, the policy has a self-fulfilling property. In conclusion, a "one-size-fits-all" adoption of high interest rate policy in a currency crisis is dangerous in general, and was inappropriate for East Asia. The desirable policy would have been to let currencies depreciate and keep interest rates stable.

Relevância:

100.00%

Resumo:

ABSTRACT
Nowadays, with the ongoing and rapid evolution of information technology and computing devices, large volumes of data are continuously collected and stored in different domains and through various real-world applications. Extracting useful knowledge from such a huge amount of data usually cannot be performed manually and requires adequate machine learning and data mining techniques. Classification is one of the most important such techniques and has been successfully applied to several areas. Roughly speaking, classification consists of two main steps: first, learn a classification model or classifier from available training data; second, classify new incoming unseen data instances using the learned classifier. Classification is supervised when all class values are present in the training data (i.e., fully labeled data), semi-supervised when only some class values are known (i.e., partially labeled data), and unsupervised when all class values are missing in the training data (i.e., unlabeled data). In addition, besides this taxonomy, the classification problem can be categorized into uni-dimensional or multi-dimensional depending on the number of class variables (one or more, respectively), or into stationary or streaming depending on the characteristics of the data and the rate of change underlying them. In this thesis, we deal with the classification problem under three different settings, namely supervised multi-dimensional stationary classification, semi-supervised uni-dimensional streaming classification, and supervised multi-dimensional streaming classification. To accomplish this task, we mainly use Bayesian network classifiers as models. The first contribution, addressing the supervised multi-dimensional stationary classification problem, consists of two new methods for learning multi-dimensional Bayesian network classifiers from stationary data. They are proposed from two different points of view. The first method, named CB-MBC, is based on a greedy forward wrapper variable-selection approach, while the second, named MB-MBC, is a filter constraint-based approach relying on Markov blankets. Both methods are applied to two important real-world problems: the prediction of human immunodeficiency virus type 1 (HIV-1) reverse transcriptase and protease inhibitors, and the prediction of the European Quality of Life-5 Dimensions (EQ-5D) from the 39-item Parkinson's Disease Questionnaire (PDQ-39). The experimental study includes comparisons of CB-MBC and MB-MBC against state-of-the-art multi-dimensional classification methods, as well as against methods commonly used to solve the Parkinson's disease prediction problem, namely multinomial logistic regression, ordinary least squares, and censored least absolute deviations. For both case studies, results are promising in terms of classification accuracy as well as in the analysis of the learned MBC graphical structures, which identify known and novel interactions among variables. The second contribution, addressing the semi-supervised uni-dimensional streaming classification problem, consists of a novel method (CPL-DS) for classifying partially labeled data streams. Data streams differ from stationary data sets in their highly rapid generation process and their concept-drifting aspect: the learned concepts and/or the underlying distribution are likely to change and evolve over time, which makes the current classification model out-of-date and requires it to be updated. CPL-DS uses the Kullback-Leibler divergence and bootstrapping to quantify and detect three possible kinds of drift: feature, conditional or dual. If any drift is detected, a new classification model is learned using the expectation-maximization algorithm; otherwise, the current classification model is kept unchanged. CPL-DS is general in that it can be applied to several classification models. Using two different models, the naive Bayes classifier and logistic regression, CPL-DS is tested with synthetic data streams and applied to the real-world problem of malware detection, where newly received files must be continuously classified as malware or goodware. Experimental results show that our approach is effective in detecting different kinds of drift from partially labeled data streams while maintaining good classification performance. Finally, the third contribution, addressing the supervised multi-dimensional streaming classification problem, consists of two adaptive methods, namely Locally Adaptive-MB-MBC (LA-MB-MBC) and Globally Adaptive-MB-MBC (GA-MB-MBC). Both methods monitor concept drift over time using the average log-likelihood score and the Page-Hinkley test. If a drift is detected, LA-MB-MBC adapts the current multi-dimensional Bayesian network classifier locally around each changed node, whereas GA-MB-MBC learns a new multi-dimensional Bayesian network classifier from scratch. An experimental study carried out using synthetic multi-dimensional data streams shows the merits of both proposed adaptive methods.
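The Page-Hinkley test that the adaptive methods use to monitor drift can be sketched as follows. This is the generic textbook form of the test for detecting an increase in the mean of a monitored score, not the thesis's implementation; the tolerance `delta` and alarm `threshold` values are illustrative.

```python
def page_hinkley(stream, delta=0.005, threshold=50.0):
    """Page-Hinkley change-detection test: flags an increase in the mean
    of `stream`. Returns the index of the observation at which drift is
    signalled, or None if no drift is detected.

    Maintains the cumulative deviation m_T = sum_t (x_t - mean_t - delta)
    and raises an alarm when m_T rises more than `threshold` above its
    running minimum.
    """
    mean = 0.0
    cum = 0.0       # cumulative deviation m_T
    cum_min = 0.0   # running minimum of m_t
    for t, x in enumerate(stream, start=1):
        mean += (x - mean) / t          # incremental mean of the stream
        cum += x - mean - delta
        cum_min = min(cum_min, cum)
        if cum - cum_min > threshold:
            return t - 1                # index of the triggering observation
    return None
```

In the thesis's setting the monitored values would be average log-likelihood scores of the current classifier; here any numeric stream works, and a stable stream never triggers the alarm.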

Relevância:

100.00%

Resumo:

Over the past 20 years, the use of Computer Algebra Systems (CAS) has helped with the teaching of mathematics in engineering schools. However, the traditional use of CAS only in math labs has led to a narrow view by the student: the CAS is additional work, not part of the learning process. The didactic guidelines of the European Higher Education Area (EHEA) propose a new teaching-learning model based on competencies, and we suggest that the use of the CAS be adapted to the new rules. In this paper, we present a model for the integrated use of the CAS, and we describe and analyze two experiments carried out in the academic year 2011–2012. Our analysis suggests that the use of CAS in all learning and assessment activities has the potential to positively influence the development of competencies.

Relevância:

100.00%

Resumo:

This dissertation employs and develops Bayesian methods for use in typical geotechnical analyses, with a particular emphasis on (i) the assessment and selection of geotechnical models based on empirical correlations, and (ii) the development of probabilistic predictions of outcomes expected for complex geotechnical models. Examples of application to geotechnical problems are developed, as follows: (1) For intact rocks, we present a Bayesian framework for model assessment to estimate Young's moduli based on their uniaxial compressive strength (UCS). Our approach provides uncertainty estimates of parameters and predictions, and can differentiate among the sources of error. We develop 'rock-specific' models for common rock types, and illustrate that such 'initial' models can be 'updated' to incorporate new project-specific information as it becomes available, reducing model uncertainties and improving their predictive capabilities. (2) For rock masses, we present an approach, based on model selection criteria, to select the most appropriate model among a set of candidates for estimating the deformation modulus of a rock mass, given a set of observed data.
Once the most appropriate model is selected, a Bayesian framework is employed to develop predictive distributions of the deformation moduli of rock masses and to update them with new project-specific data. Such a Bayesian updating approach can significantly reduce the associated predictive uncertainty and therefore affect our computed estimates of the probability of failure, which is of significant interest for reliability-based rock engineering design. (3) In the preliminary design stage of rock engineering, information about geomechanical and geometrical parameters, in-situ stress or support parameters is often scarce or incomplete. This poses difficulties in applying traditional empirical correlations, which cannot deal with incomplete data, to make predictions. Therefore, we propose the use of Bayesian networks to deal with incomplete data and, in particular, develop a Naïve Bayes classifier to predict the probability of occurrence of tunnel squeezing based on five input parameters that are commonly available, at least partially, at the design stage.
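A categorical Naive Bayes classifier of the kind described for tunnel squeezing can be sketched as below. This is a minimal generic implementation with Laplace smoothing, not the dissertation's model; the feature values ('weak', 'deep', etc.) are hypothetical stand-ins, and only two features are used for brevity where the thesis uses five design-stage parameters.

```python
from collections import Counter, defaultdict

def train_naive_bayes(samples, labels):
    """Fit a categorical Naive Bayes classifier with Laplace smoothing.

    `samples` is a list of feature tuples and `labels` the class values.
    Returns a function mapping a feature tuple to posterior class
    probabilities, computed as P(c) * prod_j P(x_j | c), normalized.
    """
    classes = Counter(labels)                    # class -> count
    counts = defaultdict(Counter)                # (class, feature) -> value counts
    for x, y in zip(samples, labels):
        for j, v in enumerate(x):
            counts[(y, j)][v] += 1
    # Distinct values seen per feature, for the smoothing denominator.
    values = [set(x[j] for x in samples) for j in range(len(samples[0]))]

    def predict_proba(x):
        scores = {}
        for c, nc in classes.items():
            p = nc / len(labels)                 # class prior
            for j, v in enumerate(x):
                # Laplace-smoothed conditional probability P(x_j | c).
                p *= (counts[(c, j)][v] + 1) / (nc + len(values[j]))
            scores[c] = p
        z = sum(scores.values())
        return {c: p / z for c, p in scores.items()}

    return predict_proba
```

Because each feature is treated as conditionally independent given the class, a prediction can still be formed when some inputs are missing, simply by skipping the absent factors, which is the property the abstract highlights for incomplete design-stage data.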

Relevância:

100.00%

Resumo:

This dissertation presents a macroeconomic analysis of Brazil, focusing on the relationship between the monthly indices of export and import volumes and the monthly figures for GDP, the SELIC rate and exchange rates, using data collected from January 2004 to December 2014, supported by a literature review of each macroeconomic concept involved in the variables studied. A case study was carried out, based on data from government websites for the chosen period, applying the linear regression method grounded in Pearson's correlation theory and reporting the results obtained for the variables over the study period. In this way, it was possible to study and analyze how the dependent (response) variables, export volume and import volume, relate to the independent (explanatory) variables: GDP, the SELIC rate and the exchange rate. The results of this study indicate a moderate negative correlation of the SELIC rate and the exchange rate with export and import volumes, while GDP shows a strong positive correlation with export and import volumes.
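The Pearson correlation underlying this analysis can be computed directly. A minimal sketch (the series below are placeholders, not the study's data):

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient between two series.

    r = cov(x, y) / (std(x) * std(y)), ranging from -1 (perfect negative
    linear relationship) to +1 (perfect positive linear relationship).
    """
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

Applied to the study's monthly series, a value near -0.5 would correspond to the "moderate and negative" correlations reported for the SELIC and exchange rates, and a value near +1 to the "strong and positive" correlation reported for GDP.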

Relevância:

100.00%

Resumo:

Variations in the physical deformation of the plasma membrane play a significant role in the sorting and behavior of the proteins that occupy it. Determining the interplay between membrane curvature and protein behavior required the development and thorough characterization of a model plasma membrane with well-defined and localized regions of curvature. This model system consists of a fluid lipid bilayer supported by a glass substrate patterned with dye-loaded polystyrene nanoparticles. As the physical deformation of the supported lipid bilayer is essential to our understanding of the behavior of the proteins occupying the bilayer, the structure of the model plasma membrane was characterized extensively. Neither the regions of curvature in the vicinity of the polystyrene nanoparticles nor the interaction between a lipid bilayer and small patches of curved polystyrene is well understood, so the results of experiments to determine these properties are described. To do so, individual fluorescently labeled proteins and lipids are tracked on this model system and in live cells, and new methods for analyzing the resulting tracks and ensemble data are presented and discussed. To validate the model system and analytical methods, fluorescence microscopy was used to image a peripheral membrane protein, cholera toxin subunit B (CTB). These results are compared with results obtained from membrane components that were not expected to show a preference for membrane curvature: an individual fluorescently labeled lipid, lissamine rhodamine B DHPE, and another protein, streptavidin associated with biotin-labeled DHPE. The observed tendency of cholera toxin subunit B to avoid regions of curvature, as determined by new and established analytical methods, is presented and discussed.

Relevância:

100.00%

Resumo:

During the Great Recession, central banks went well beyond their normal operations and provided liquidity in unlimited amounts, in foreign currency and to foreign banks. Central bank cooperation took the form of a swap network, and amounted to an episode of global monetary policy. However, though central bank cooperation will continue to contribute to global governance, the swap network should not be made permanent and given an institutional basis to provide international lending of last resort. Swaps are a monetary policy tool and, like all other monetary policy tools, should continue to be decided on by central banks, to avoid impinging on their independence, which a difficult historical process has shown to be the best basis for price stability.

Relevância:

100.00%

Resumo:

This paper first takes a step backwards, situating the recent adoption of the Treaty on Stability, Coordination and Governance in the Economic and Monetary Union in the context of discussions on the Stability and Growth Pact (SGP) and the ‘Maastricht criteria’ fixed in the Maastricht Treaty for membership in the Economic and Monetary Union (EMU), within the longer perspective of the sharing of competences for macroeconomic policy-making within the EU. It then presents the main features of the new so-called ‘Fiscal Compact’ and its relationship to the SGP, and draws some conclusions about the importance and relevance of this new step in the process of economic policy coordination. It concludes that the Treaty on Stability, Coordination and Governance in the Economic and Monetary Union does not seem to offer a definitive solution to the problem of finding the appropriate budgetary-monetary policy mix in EMU, a problem already well identified in the Delors report of 1989, regularly emphasised ever since, and now seriously aggravated by the crisis in the eurozone. Furthermore, implementation of this Treaty may, under certain circumstances, increase uncertainty about the distribution of competences between the European Parliament and national parliaments, and between the former and the Commission and the Council.

Relevância:

100.00%

Resumo:

The European market for asset-backed securities (ABS) has all but closed for business since the start of the economic and financial crisis. ABS (see Box 1) were in fact the first financial assets hit at the onset of the crisis in 2008. The subprime mortgage meltdown caused a deterioration in the quality of collateral in the ABS market in the United States, which in turn dried up overall liquidity because ABS AAA notes were popular collateral for inter-bank lending. The lack of demand for these products, together with the Great Recession in 2009, had a considerable negative impact on the European ABS market. The post-crisis regulatory environment has further undermined the market. The practice of slicing and dicing of loans into ABS packages was blamed for starting and spreading the crisis through the global financial system. Regulation in the post-crisis context has thus been relatively unfavourable to these types of instruments, with heightened capital requirements now necessary for the issuance of new ABS products. And yet policymakers have recently underlined the need to revitalise the ABS market as a tool to improve credit market conditions in the euro area and to enhance transmission of monetary policy. In particular, the European Central Bank and the Bank of England have jointly emphasised that: “a market for prudently designed ABS has the potential to improve the efficiency of resource allocation in the economy and to allow for better risk sharing... by transforming relatively illiquid assets into more liquid securities. These can then be sold to investors thereby allowing originators to obtain funding and, potentially, transfer part of the underlying risk, while investors in such securities can diversify their portfolios... . This can lead to lower costs of capital, higher economic growth and a broader distribution of risk” (ECB and Bank of England, 2014a). 
In addition, consideration has started to be given to the extent to which ABS products could become the target of explicit monetary policy operations, a line of action proposed by Claeys et al. (2014). The ECB has officially announced the start of preparatory work related to possible outright purchases of selected ABS. In this paper we discuss how a revamped market for corporate loans securitised via ABS products, and the use of ABS as a monetary policy instrument, can indeed play a role in revitalising Europe’s credit market. However, before this instrument is used, a number of issues should be addressed. First, the European ABS market has contracted significantly since the crisis, so it needs to be revamped through appropriate regulation if securitisation is to play a role in improving the efficiency of resource allocation in the economy. Second, even assuming that this market can expand again, the European ABS market is heterogeneous: lending criteria differ across countries and banking institutions, and the rating methodologies used to assess the quality of borrowers have to take these differences into account. A further element of differentiation is default law, which is specific to national jurisdictions in the euro area. The pool of loans will therefore differ not only in terms of the macro risks related to each country of origination (a ‘positive’ idiosyncratic risk, because it enables a portfolio manager to differentiate), but also in normative terms in case of default. The latter introduces uncertainties and inefficiencies into the ABS market that could create arbitrage opportunities. It is also unclear to what extent a direct purchase of these securities by the ECB would have an impact on the credit market.
This will depend, for example, on the type of securities targeted in terms of the underlying assets considered eligible for inclusion (such as loans to small and medium-sized companies, car loans, leases, and residential and commercial mortgages). The timing of a possible move by the ECB is also an issue: immediate action would take place in the context of relatively limited market volumes, while if the ECB waits, it might have access to a larger market, provided steps are taken in the next few months to revamp the market. We start by discussing the first of these issues, the size of the EU ABS market, and estimate how much this market could be worth if certain specific measures are implemented. We then discuss the different options available to the ECB should it decide to intervene in the EU ABS market. We include a preliminary list of regulatory steps that could be taken to homogenise asset-backed securities in the euro area. We conclude with our recommended course of action.

Relevância:

100.00%

Resumo:

Thesis (Ph.D.)--University of Washington, 2016-08