920 results for New Keynesian model, Bayesian methods, Monetary policy, Great Inflation


Relevance:

100.00%

Publisher:

Abstract:

Companies are increasingly dependent on distributed web-based software systems to support their businesses. This increases the need to maintain and extend those systems with new, up-to-date features. The development process for introducing new features therefore needs to be swift and agile, and the supporting software evolution process needs to be safe, fast, and efficient. However, this is usually a difficult and challenging task for developers, due to the limited support offered by programming environments, frameworks, and database management systems. Changes needed at the level of the code, the database model, and the actual data contained in the database must be planned and developed together and executed in a synchronized way. Even under a careful development discipline, the impact of changing an application's data model is hard to predict. Over an application's lifetime, changes and updates are designed and tested against data that is usually far from the real production data. Hand-coding DDL and DML SQL scripts to update the database schema and data is therefore the usual (and hard) approach taken by developers. Such a manual approach is error-prone and disconnected from the real data in production, because developers may not know the exact impact of their changes. This work aims to improve the maintenance process in the context of the Agile Platform by Outsystems. Our goal is to design and implement new data-model evolution features that ensure safe support for change and a sound migration process. Our solution includes impact analysis mechanisms targeting both the data model and the data itself. This provides developers with a safe, simple, and guided evolution process.
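The impact-analysis idea described above can be illustrated with a pre-migration check. The sketch below is only a minimal, hypothetical illustration (the table, column, and data are invented; it is not the mechanism implemented in the Agile Platform): before a DDL change tightens a constraint, the production data is queried for the rows the new schema would reject.

```python
import sqlite3

# Hypothetical scenario: a planned DDL change will make "email" NOT NULL.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
con.executemany("INSERT INTO users (email) VALUES (?)",
                [("a@example.com",), (None,), ("b@example.com",)])

# Impact analysis: count the rows the new constraint would reject,
# instead of discovering them when the migration fails in production.
violations = con.execute(
    "SELECT COUNT(*) FROM users WHERE email IS NULL").fetchone()[0]
print(violations)  # 1
```

A guided evolution process can run checks of this kind automatically for each planned schema change and surface the affected rows to the developer before the migration is executed.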

Relevance:

100.00%

Publisher:

Abstract:

In order to address the wastewater contamination problem of the Sines refinery, with the main objectives of optimizing the quality of this stream and reducing the costs charged to the refinery, a dynamic mass balance for ammonia and for polar oil and grease (O&G) contamination in the wastewater circuit was developed and implemented. The inadequate routing of sour gas from the sour water stripping unit and from the kerosene caustic washing unit was identified as the major source of, respectively, the ammonia and the polar substances present in the industrial wastewater effluent. For the O&G content, a predictive model was developed for the kerosene caustic washing unit, following the Projection to Latent Structures (PLS) approach. Comparison between the analytical data for ammonia and polar O&G concentrations in the refinery wastewater, originating from the Dissolved Air Flotation (DAF) effluent, and the predictions of the dynamic mass balance shows very good agreement and highlights the dominant impact of the identified streams on the wastewater contamination levels. The ammonia contamination problem was solved by rerouting the sour gas through an existing line, which had become clogged with ammonia salts due to a non-insulated section. For the O&G contamination, the dynamic mass balance was implemented as an online tool, which makes it possible to anticipate contamination events and take the required preventive actions, and which can also serve as a basis for relating the O&G contamination in the refinery wastewater to the properties of the refined crude oils and to the process operating conditions. The PLS model developed could be a great asset both in optimizing existing refinery wastewater treatment units or reuse schemes and in designing new ones.
To find a possible treatment solution for the spent caustic problem, on-site pilot plant experiments on NaOH recovery from the effluent of the refinery's kerosene caustic washing unit were performed using an alkaline-resistant nanofiltration (NF) polymeric membrane, in order to evaluate its applicability to treating these highly alkaline and contaminated streams. At constant operating pressure and temperature and under adequate operating conditions, rejections of 99.9% for oil and grease and 97.7% for chemical oxygen demand (COD) were observed. No noticeable membrane fouling or flux decrease was registered up to a volume concentration factor of 3. These results allow the NF permeate to be reused in place of fresh caustic and the wastewater contamination to be significantly reduced, which can result in savings of 1.5 M€ per year, at current prices, for the largest Portuguese oil refinery. The capital investment needed to implement the required NF membrane system is less than 10% of that associated with the traditional wet air oxidation solution to the spent caustic problem. The operating costs are very similar, but can be less than half if the NF concentrate is reused in refinery pH control applications. The payback period was estimated at 1.1 years. Overall, the pilot plant experimental results and the process economic evaluation indicate that the proposed NF treatment process is a very competitive and highly promising alternative to conventional, existing spent caustic treatment units.
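As a rough sanity check on the economics reported above, a simple payback calculation relates capital cost to annual savings. In this sketch the annual savings figure comes from the text (1.5 M€ per year), while the capital cost is an assumed illustrative value chosen only to be consistent with the reported 1.1-year payback; it is not a number from the study.

```python
annual_savings_meur = 1.5   # reported savings, in M EUR per year
capex_meur = 1.65           # assumed capital cost (illustrative, not from the study)

# Simple payback period: capital cost divided by annual savings
payback_years = capex_meur / annual_savings_meur
print(round(payback_years, 2))  # 1.1
```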

Relevance:

100.00%

Publisher:

Abstract:

The issues concerning Crisis Situations within the scope of police activity, raised by incidents considered critical, have emerged with greater intensity in recent decades, posing a major challenge for police forces around the world. These are situations or events of crucial importance, involving hostage-taking or barricaded individuals, in which human lives are inevitably at risk, requiring from law enforcement agencies a specific response capability, i.e., a type of intervention outside the parameters considered routine, in order to find solutions that minimize the possibility of casualties. Because these are high-impact situations of extreme gravity, in which the preservation of human life is at stake and, in many cases, the very Rule of Law as well, we understand the need for police forces to adapt to new procedures and working methods. Such procedures are an enormously complex task, requiring the coordination and articulation of several components, not infrequently including the work of different police forces, as well as organizations and entities with varied powers and duties, which implies the need for effective management. This explains the emergence of Crisis Management Structures and makes it necessary to determine their fundamental components, their importance, how they interconnect, and their overall goal. We will also analyze the intrinsic features of what we consider the fundamental groundwork of a Crisis Management Structure: Negotiation itself, understood as a form of police intervention in which a wide range of procedures feeds a channel of dialogue aimed at minimizing the damage resulting from an extreme action and, in particular, at preventing the death of any of those involved.
This is, in essence, the path chosen for this study, which seeks to answer the fundamental question: What model of Crisis Management Structure should be adopted to manage a critical event involving hostage negotiation?

Relevance:

100.00%

Publisher:

Abstract:

This research assesses the current situation regarding the adoption of IAS 38 and its consequences for the recognition, measurement, and mandatory disclosure of intangibles, as well as their voluntary disclosure, internal and external, in Brazilian companies listed on the BM&FBOVESPA. Adopting a positivist perspective and a quantitative approach, a questionnaire survey was used as the data collection method. The questionnaire was built from scratch, supported by the theoretical framework. The results are interpreted in the light of the arguments of stakeholder theory and institutional theory, which showed good explanatory potential for the phenomenon under analysis. With the adoption of IAS/IFRS for the preparation of consolidated financial statements of listed and unlisted companies, Brazil began to apply IAS 38 to the recording of transactions involving intangible assets, mandatorily from the 2010 fiscal year. The adoption of this standard was driven essentially by legal pressure. However, the companies that adopted IAS 38 voluntarily validated the reasons presented in the literature. There is agreement with the prescribed new accounting treatment of intangibles, and the degree of satisfaction with the accounting treatment of intangible assets under the new accounting model adopted in Brazil (IAS 38 and CPC 04) is high. As the main difficulties in the accounting recognition of intangible assets under the standard in force, the problems highlighted in the literature were confirmed, such as uncertainty about future economic benefits and the lack of a sufficiently reliable measure for recording transactions. Brazilian companies believe it is important to expand disclosure about intangibles; however, internal and external disclosure is still at an embryonic stage and is not a widespread practice.
Nevertheless, the objectives presented in the literature for the internal and external disclosure of information about intangibles find empirical support in this study. With respect to stakeholders, we conclude that Brazilian companies are greatly concerned with meeting their needs when it comes to voluntary disclosure about intangibles, understanding that all of them can benefit from this information.

Relevance:

100.00%

Publisher:

Abstract:

Under the framework of constraint-based modeling, genome-scale metabolic models (GSMMs) have been used for several tasks, such as metabolic engineering and phenotype prediction. More recently, their application in health-related research has spanned drug discovery, biomarker identification, and host-pathogen interactions, targeting diseases such as cancer, Alzheimer's disease, obesity, and diabetes. In recent years, the development of novel techniques for genome sequencing and other high-throughput methods, together with advances in Bioinformatics, has allowed the reconstruction of GSMMs for human cells. Considering the diversity of cell types and tissues in the human body, it is imperative to develop tissue-specific metabolic models. Methods to automatically generate these models, based on generic human metabolic models and a plethora of omics data, have been proposed, but their results have not yet been adequately and critically evaluated and compared. This work presents a survey of the most important tissue- or cell-type-specific metabolic model reconstruction methods, which use literature, transcriptomics, proteomics, and metabolomics data, together with a global template model. As a case study, we analyzed the consistency between several omics data sources and reconstructed distinct metabolic models of hepatocytes using different methods and data sources as inputs. The results show that the omics data sources overlap poorly and, in some cases, are even contradictory. Additionally, the hepatocyte metabolic models generated are in many cases unable to perform metabolic functions known to be present in liver tissue. We conclude that reliable methods for a priori omics data integration are required to support the reconstruction of complex models of human cells.
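The poor overlap between omics sources can be quantified with a simple set-based measure such as the Jaccard index. A minimal sketch (the gene symbols below are hypothetical example inputs, not data from the study):

```python
def jaccard(a, b):
    """Jaccard index: |A & B| / |A | B|, i.e. intersection over union."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a | b) else 0.0

# Hypothetical evidence sets for hepatocyte genes from two omics sources
transcriptomics = {"CYP3A4", "PCK1", "ALDOB", "G6PC"}
proteomics = {"CYP3A4", "PCK1", "ALB"}

overlap = jaccard(transcriptomics, proteomics)
print(round(overlap, 2))  # 0.4
```

A low index (here 2 shared genes out of 5 in the union) is the kind of inconsistency between data sources that the case study reports.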

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this paper is to study the determinants of equilibrium in the market for daily funds. We use the EONIA panel database, which includes daily information on the lending rates applied by contributing commercial banks. The data clearly show an increase in both the time-series volatility and the cross-section dispersion of rates towards the end of the reserve maintenance period, and these increases are highly correlated. With respect to quantities, we find that the volume of trade, as well as the use of the standing facilities, is also larger at the end of the maintenance period. Our theoretical model shows how the operational framework of monetary policy causes a reduction in the elasticity of the supply of funds by banks throughout the reserve maintenance period. This reduction in elasticity, together with market segmentation and heterogeneity, can generate distributions of interest rates and quantities traded with the same properties as in the data.

Relevance:

100.00%

Publisher:

Abstract:

This paper presents evidence that deposit and lending facilities, combined with an averaging provision for the reserve requirement, are powerful tools for stabilizing the overnight rate. We reach this conclusion by comparing the behavior of this rate in Germany before and after the start of the EMU. The analysis of the German experience is useful because it allows us to isolate the effects of these particular instruments of monetary policy on the overnight rate. To show that this outcome is a general conclusion and not a particular result of the German market, we develop a theoretical model of reserve management that is able to reproduce our empirical findings.

Relevance:

100.00%

Publisher:

Abstract:

The choice of either the rate of monetary growth or the nominal interest rate as the instrument controlled by the monetary authorities has both positive and normative implications for economic performance. We reexamine some of the issues related to the choice of the monetary policy instrument in a dynamic general equilibrium model exhibiting endogenous growth, in which a fraction of productive government spending is financed by issuing currency. When we evaluate the performance of the two monetary instruments in terms of the fluctuations of endogenous variables, we find that the inflation rate is less volatile under nominal interest rate targeting. Concerning the fluctuations of consumption and of the growth rate, both monetary policy instruments lead to statistically equivalent volatilities. Finally, we show that neither of the two targeting procedures displays unambiguously higher welfare levels.

Relevance:

100.00%

Publisher:

Abstract:

The paper analyses how EU foreign policy towards Georgia changed after the Rose Revolution, reaching greater levels of involvement and assistance. It is argued that the pro-western and reformist new government in Georgia triggered a new orientation in EU foreign policy towards the country, based on a logic of appropriateness, that is, on the EU's values, in addition to energy interests. Comparative analysis of the South Caucasus and other Eastern European countries shows that reformist and pro-EU governments receive more EU support and assistance. This does not mean that material interests do not play an important role; however, the EU appears to be coherent with its values with regard to the European neighbourhood.

Relevance:

100.00%

Publisher:

Abstract:

We develop methods for Bayesian inference in vector error correction models which are subject to a variety of switches in regime (e.g. Markov switches in regime or structural breaks). An important aspect of our approach is that we allow both the cointegrating vectors and the number of cointegrating relationships to change when the regime changes. We show how Bayesian model averaging or model selection methods can be used to deal with the high-dimensional model space that results. Our methods are used in an empirical study of the Fisher effect.
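In its simplest form, Bayesian model averaging weights each candidate model by its posterior probability, which under equal prior model weights is proportional to the exponentiated log marginal likelihood. A generic sketch of that weighting step (the numerical values are illustrative, not results from the paper):

```python
import math

def bma_weights(log_margliks):
    """Posterior model probabilities under equal prior model weights,
    computed stably by subtracting the maximum log marginal likelihood."""
    m = max(log_margliks)
    unnorm = [math.exp(l - m) for l in log_margliks]
    total = sum(unnorm)
    return [u / total for u in unnorm]

# Illustrative log marginal likelihoods for three regime/cointegration settings
weights = bma_weights([-101.2, -100.0, -103.5])
print([round(w, 3) for w in weights])  # [0.226, 0.751, 0.023]
```

Model selection keeps only the highest-weight model, while model averaging combines all models with these weights; both let the data arbitrate over a high-dimensional model space.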

Relevance:

100.00%

Publisher:

Abstract:

The financial crisis, on the one hand, and the recourse to ‘unconventional’ monetary policy, on the other, have given a sharp jolt to perceptions of the role and status of central banks. In this paper we start with a brief ‘contrarian’ history of central banks since the Second World War, which presents the Great Moderation and the restricted focus on inflation targeting as a temporary aberration from the norm. We then discuss how recent developments in fiscal and monetary policy have affected the role and status of central banks, notably their relationships with governments, before considering the environment central banks will face in the near and middle future and how they will have to change to address it.

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a DSGE model in which long run inflation risk matters for social welfare. Aggregate and welfare effects of long run inflation risk are assessed under two monetary regimes: inflation targeting (IT) and price-level targeting (PT). These effects differ because IT implies base-level drift in the price level, while PT makes the price level stationary around a target price path. Under IT, the welfare cost of long run inflation risk is equal to 0.35 percent of aggregate consumption. Under PT, where long run inflation risk is largely eliminated, it is lowered to only 0.01 percent. There are welfare gains from PT because it raises average consumption for the young and lowers consumption risk substantially for the old. These results are strongly robust to changes in the PT target horizon and fairly robust to imperfect credibility, fiscal policy, and model calibration. While the distributional effects of an unexpected transition to PT are sizeable, they are short-lived and not welfare-reducing.

Relevance:

100.00%

Publisher:

Abstract:

This paper discusses the challenges faced by the empirical macroeconomist and methods for surmounting them. These challenges arise due to the fact that macroeconometric models potentially include a large number of variables and allow for time variation in parameters. These considerations lead to models which have a large number of parameters to estimate relative to the number of observations. A wide range of approaches are surveyed which aim to overcome the resulting problems. We stress the related themes of prior shrinkage, model averaging and model selection. Subsequently, we consider a particular modelling approach in detail. This involves the use of dynamic model selection methods with large TVP-VARs. A forecasting exercise involving a large US macroeconomic data set illustrates the practicality and empirical success of our approach.
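Dynamic model selection over a large model set is commonly implemented as a recursion: model probabilities are flattened by a forgetting factor each period and then updated with each model's predictive likelihood. The one-step sketch below is generic; the forgetting factor and likelihood values are illustrative assumptions, not the paper's calibration.

```python
import math

def dms_step(probs, log_pred_liks, alpha=0.99):
    """One recursion of dynamic model averaging/selection:
    forgetting-factor prediction, then a Bayesian-style update."""
    # Prediction: raise to alpha < 1, which flattens the distribution
    pred = [p ** alpha for p in probs]
    s = sum(pred)
    pred = [p / s for p in pred]
    # Update: weight each model by its predictive likelihood
    post = [p * math.exp(l) for p, l in zip(pred, log_pred_liks)]
    s = sum(post)
    return [p / s for p in post]

# Two candidate models start at equal probability; model 1 predicts better,
# so its probability rises after the update.
probs = dms_step([0.5, 0.5], [-1.0, -0.5])
```

The forgetting factor lets model weights drift over time, which is what makes the recursion suitable for large TVP-VAR settings where the best model may change.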

Relevance:

100.00%

Publisher:

Abstract:

In this study we elicit agents' prior information sets regarding a public good, exogenously give information treatments to survey respondents, and subsequently elicit willingness to pay (WTP) for the good and posterior information sets. The design of this field experiment allows us to perform theoretically motivated hypothesis testing between different updating rules: non-informative updating, Bayesian updating, and incomplete updating. We find causal evidence that agents imperfectly update their information sets. We also find causal evidence that the amount of additional information provided to subjects, relative to their pre-existing information levels, can affect stated WTP in ways consistent with overload from too much learning. This result raises important (though familiar) issues for the use of stated preference methods in policy analysis.
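The competing updating rules can be nested in a single partial-adjustment form: a weight of 0 corresponds to non-informative updating, a weight of 1 to full updating toward the new information, and intermediate weights to incomplete updating. A stylized sketch (the prior and signal values are invented for illustration):

```python
def updated_belief(prior, signal, weight):
    """Partial-adjustment updating: weight in [0, 1] governs how far the
    belief moves from the prior toward the new information."""
    return prior + weight * (signal - prior)

prior, signal = 10.0, 20.0
print(updated_belief(prior, signal, 0.0))  # 10.0  non-informative updating
print(updated_belief(prior, signal, 1.0))  # 20.0  full updating
print(updated_belief(prior, signal, 0.4))  # 14.0  incomplete updating
```

Testing between the rules then amounts to estimating the adjustment weight from elicited prior and posterior information sets and checking whether it is 0, 1, or strictly in between.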