877 results for "Long run neutrality of money"


Relevance: 100.00%

Abstract:

This paper uses dynamic programming to study the time consistency of optimal macroeconomic policy in economies with recurring public deficits. To this end, a general equilibrium recursive model introduced in Chang (1998) is extended to include government bonds and production. The original model presents a Sidrauski economy with money and transfers only, implying that the need for government financing through the inflation tax is minimal. The extended model introduces government expenditures and a deficit-financing scheme, analyzing the Sargent-Wallace (1981) problem: recurring deficits may lead the government to default on part of its public debt through inflation. The methodology allows for the computation of the set of all sustainable stabilization plans even when the government cannot pre-commit to an optimal inflation path. This is done through value function iterations, which can be carried out on a computer. The parameters of the extended model are calibrated with Brazilian data, using as case studies three Brazilian stabilization attempts: the Cruzado (1986), Collor (1990) and Real (1994) plans. The calibration of the parameters of the extended model is straightforward, but its numerical solution proves unfeasible due to a dimensionality problem in the algorithm arising from limitations of available computer technology. However, a numerical solution using the original algorithm and some calibrated parameters is obtained. Results indicate that in the absence of government bonds or production only the Real Plan is sustainable in the long run. The numerical solution of the extended algorithm is left for future research.
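As a rough illustration of the computational approach, the sketch below shows generic value function iteration on a finite, discretized dynamic program. It is not Chang's (1998) algorithm for computing the set of sustainable plans; the reward and transition arrays are placeholders.

    import numpy as np

    def value_function_iteration(R, T, beta=0.95, tol=1e-8, max_iter=10_000):
        """Generic value function iteration on a finite dynamic program.

        R[s, a]     : immediate reward of action a in state s
        T[s, a, s2] : probability of moving from s to s2 under action a
        beta        : discount factor
        """
        V = np.zeros(R.shape[0])
        for _ in range(max_iter):
            # Bellman operator: Q[s, a] = R[s, a] + beta * E[V(next state) | s, a]
            Q = R + beta * np.einsum("sat,t->sa", T, V)
            V_new = Q.max(axis=1)
            if np.max(np.abs(V_new - V)) < tol:
                V = V_new
                break
            V = V_new
        return V, Q.argmax(axis=1)   # value function and greedy policy

    # Illustrative 2-state, 2-action problem (placeholder numbers).
    R = np.array([[1.0, 0.0],
                  [0.0, 2.0]])
    T = np.array([[[0.9, 0.1], [0.2, 0.8]],
                  [[0.5, 0.5], [0.1, 0.9]]])
    V, policy = value_function_iteration(R, T)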

Relevance: 100.00%

Abstract:

This paper demonstrates that the applied monetary models - the Sidrauski-type models and the cash-in-advance models, augmented with a banking sector that supplies money-substitute services - imply trajectories which are Pareto-optimal restricted to a given path of the real quantity of money. As a consequence, three results follow. First, Bailey's formula for evaluating the welfare cost of inflation is indeed accurate, provided that the long-run capital stock does not depend on the inflation rate and that the compensated demand is considered. Second, the relevant money demand concept for this issue - the impact of inflation on welfare - is the monetary base. Third, if the long-run capital stock depends on the inflation rate, this dependence has a second-order impact on welfare and, conceptually, it is not a distortion from the social point of view. These three implications moderate some evaluations of the welfare cost of perfectly predicted inflation.
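For reference, a common textbook statement of Bailey's measure (the notation here is mine, not the paper's): with compensated money demand m(i) as a function of the nominal interest rate i, the welfare cost of a positive interest rate is the consumer-surplus area under the demand curve,

    w(i) = \int_0^i m(x)\,dx \; - \; i\, m(i),

so that, for an illustrative log-log demand m(i) = A i^{-\eta} with \eta < 1, one gets w(i) = \frac{\eta}{1-\eta} A\, i^{1-\eta}. The abstract's first result says this area is an accurate welfare measure only when the compensated demand is used and the long-run capital stock is invariant to inflation.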

Relevance: 100.00%

Abstract:

Hazard and risk assessment of landslides with potentially long run-out is becoming more and more important. Numerical tools exploiting different constitutive models, initial data and numerical solution techniques are important for making the expert’s assessment more objective, even though they cannot substitute for the expert’s understanding of the site-specific conditions and the processes involved. This paper presents a depth-integrated model accounting for pore water pressure dissipation, with applications both to real events and to problems for which analytical solutions exist. The main ingredients are: (i) the mathematical model, which includes pore pressure dissipation as an additional equation; this makes it possible to model flowslide problems with high mobility at the beginning, the landslide mass coming to rest once pore water pressures dissipate; (ii) the rheological models describing basal friction: Bingham, frictional, Voellmy and cohesive-frictional viscous models; (iii) simple erosion laws, providing a comparison between the approaches of Egashira, Hungr and Blanc; and (iv) a Lagrangian SPH model to discretize the equations, including pore water pressure information associated with the moving SPH nodes.
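To make the rheology options concrete, the sketch below evaluates two of the listed basal friction laws, pure frictional (with pore pressure reduction of the effective stress) and Voellmy, in their standard textbook forms. The function name and parameter values are illustrative and are not taken from the paper.

    import numpy as np

    RHO = 2000.0   # bulk density [kg/m^3] (illustrative)
    G = 9.81       # gravity [m/s^2]

    def basal_shear_stress(h, v, p_w=0.0, model="voellmy",
                           tan_phi=0.4, mu=0.2, xi=500.0):
        """Basal shear resistance for a depth-integrated landslide model.

        h   : flow depth [m]
        v   : depth-averaged speed [m/s]
        p_w : basal pore water pressure [Pa]
        Standard textbook forms; parameter values are purely illustrative.
        """
        sigma_v = RHO * G * h            # total basal normal stress
        if model == "frictional":
            # Coulomb friction acting on the effective stress
            return max(sigma_v - p_w, 0.0) * tan_phi
        if model == "voellmy":
            # Coulomb term plus a turbulent (velocity-squared) term
            return mu * max(sigma_v - p_w, 0.0) + RHO * G * v**2 / xi
        raise ValueError(f"unknown rheology: {model}")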

Relevance: 100.00%

Abstract:

We test for the existence of a long-run money demand relationship for the UK involving household-sector Divisia and simple sum monetary indexes for the period from 1977 to 2008. We construct our Divisia index using non-break-adjusted levels and break-adjusted flows following the Bank of England. We test for cointegration between the real Divisia and simple sum indexes, their corresponding opportunity cost measures, real income and real share prices. Our results support the existence of a long-run money demand relationship for both the Divisia and simple sum indexes.
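A minimal sketch of the kind of system cointegration test involved (a Johansen trace test via statsmodels) is shown below. The series are synthetic placeholders standing in for the paper's Divisia aggregate, its opportunity cost, real income and real share prices, not the actual UK data, and the lag choice is arbitrary.

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.vector_ar.vecm import coint_johansen

    # Placeholder data; columns stand in for the variables in the paper.
    rng = np.random.default_rng(0)
    n = 128
    common = np.cumsum(rng.normal(size=n))            # shared stochastic trend
    df = pd.DataFrame({
        "real_divisia":   common + rng.normal(scale=0.5, size=n),
        "user_cost":     -0.3 * common + rng.normal(scale=0.5, size=n),
        "real_income":    0.8 * common + rng.normal(scale=0.5, size=n),
        "real_shares":    0.5 * common + rng.normal(scale=0.5, size=n),
    })

    # Johansen trace test: det_order=0 (constant term), two lagged differences.
    result = coint_johansen(df, det_order=0, k_ar_diff=2)
    for r, (stat, cvs) in enumerate(zip(result.lr1, result.cvt)):
        print(f"rank <= {r}: trace = {stat:.2f}, 95% critical value = {cvs[1]:.2f}")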

Relevance: 100.00%

Abstract:

The literature on bond markets and interest rates has focused largely on the term structure of interest rates, specifically on the so-called expectations hypothesis. At the same time, little is known about the nature of the spread of interest rates in the money market beyond the fact that such spreads are generally unstable. However, with the evolution of complex financial instruments, it has become imperative to identify the time series process that can help one accurately forecast such spreads into the future. This article explores the nature of the time series process underlying the spread between three-month and one-year US rates, and concludes that the movements in this spread over time are best captured by a GARCH(1,1) process. It also suggests the use of a relatively long-term measure of interest rate volatility as an explanatory variable. This exercise has gained added importance in view of the revelation that GARCH-based estimates of option prices consistently outperform the corresponding estimates based on the stylized Black-Scholes algorithm.
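A sketch of fitting such a conditional-variance model with the Python arch package is given below. The spread series is a synthetic placeholder, and the article's exact mean specification and additional volatility regressor are not reproduced.

    import numpy as np
    import pandas as pd
    from arch import arch_model

    # Placeholder series standing in for the 1-year minus 3-month US rate spread.
    rng = np.random.default_rng(1)
    spread = pd.Series(np.cumsum(rng.normal(scale=0.02, size=1000)))
    d_spread = spread.diff().dropna() * 100      # work in basis points

    # GARCH(1,1) with a constant mean for the change in the spread.
    am = arch_model(d_spread, mean="Constant", vol="Garch", p=1, q=1)
    res = am.fit(disp="off")
    print(res.summary())

    # One-step-ahead conditional variance forecast.
    print(res.forecast(horizon=1).variance.iloc[-1])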

Relevance: 100.00%

Abstract:

We examine the returns to UK government bonds before, during and between the phases of quantitative easing to identify the side effects for the market itself. We show that the onset of QE led to a sustained reduction in the costs of trading and removed some return regularities. However, controlling for a wide range of market activity, including issuance and QE announcements, we find evidence that investors could have earned excess returns after costs by trading in response to the purchase auction calendar. Drawing on economic theory, we explore the implications of these findings for both the efficiency of the market and the costs of government debt management in both the short and long run.

Relevance: 100.00%

Abstract:

This dissertation examines the behavior of the exchange rate under two different scenarios. The first is characterized by relatively low inflation, a situation where prices adjust sluggishly. The second is a high-inflation economy where prices respond very rapidly even to unanticipated shocks. In the first, following a monetary expansion, the exchange rate overshoots, i.e. the nominal exchange rate depreciates at a faster pace than the price level. Under high levels of inflation, prices change faster than the exchange rate, so the exchange rate undershoots its long-run equilibrium value. The standard work in this area, Dornbusch (1976), explains the overshooting process in the context of perfect capital mobility and sluggish adjustment in the goods market: a monetary expansion will make the exchange rate increase beyond its long-run equilibrium value. This dissertation expands on Dornbusch's model and provides an analysis of the exchange rate under conditions of currency substitution and price flexibility, characteristics of the Peruvian economy during the hyperinflation that took place at the end of the 1980s. The results of the modified Dornbusch model reveal that, given a monetary expansion, the change in the price level will be larger than the change in the exchange rate if prices react more than proportionally to the monetary shock. We would expect this over-reaction in circumstances of high inflation, when the velocity of money is increasing very rapidly. Increasing velocity of money gives rise to higher relative price variability, which in turn contributes to the appearance of new financial (and also non-financial) instruments that offer a higher return than the exchange rate, causing people to switch their demand for foreign exchange to these new assets. In the context of currency substitution, economic agents hoard and use foreign exchange as a store of value. The large decline in output caused by hyperinflation induces people to sell this hoarded money to finance current expenses, increasing the supply of foreign exchange in the market. Both the decrease in demand and the increase in supply reduce the price of foreign exchange, i.e. the real exchange rate. These findings are tested using Peruvian data for the period January 1985-July 1990; the results of the econometric estimation confirm the findings of the theoretical model.
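For readers who want the mechanics, here is the standard textbook version of the overshooting result (the notation is mine), which also shows where undershooting comes from. Combining money market equilibrium m - p = \phi y - \lambda i with uncovered interest parity under regressive expectations, i = i^* + \theta(\bar{e} - e), and subtracting the long-run relation gives

    e - \bar{e} = \frac{\bar{p} - p}{\lambda\theta}.

A permanent monetary expansion dm raises \bar{p} and \bar{e} one-for-one (long-run neutrality). If the price level is fixed on impact, the exchange rate jumps by dm\,(1 + 1/(\lambda\theta)) and overshoots its new long-run value. If instead prices jump by more than dm on impact, as in the high-inflation case studied here, the same relation delivers an impact depreciation smaller than dm, i.e. undershooting.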

Relevance: 100.00%

Abstract:

This study investigates the short-run dynamics and long-run equilibrium relationship between residential electricity demand and factors influencing demand - per capita income, price of electricity, price of kerosene oil and price of liquefied petroleum gas - using annual data for Sri Lanka for the period 1960-2007. The study uses unit root, cointegration and error-correction models. The long-run demand elasticities with respect to income, own price and the price of kerosene oil (a substitute) were estimated to be 0.78, -0.62 and 0.14 respectively. The short-run elasticities for the same variables were estimated to be 0.32, -0.16 and 0.10 respectively. Liquefied petroleum (LP) gas is a substitute for electricity only in the short run, with an elasticity of 0.09. The main findings of the paper support the following: (1) increasing the price of electricity is not the most effective tool to reduce electricity consumption; (2) existing subsidies on electricity consumption can be removed without reducing government revenue; (3) the long-run income elasticity of demand shows that any future increase in household incomes is likely to significantly increase the demand for electricity; and (4) any power generation plans which consider only current per capita consumption and population growth should be revised taking into account potential future income increases in order to avoid power shortages in the country.
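A compact sketch of the two-step (Engle-Granger style) error-correction estimation that produces long- and short-run elasticities of this kind is shown below. The data are synthetic placeholders, not the Sri Lankan series, and the variable names are illustrative.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Placeholder annual data in logs; columns stand in for the study's variables.
    rng = np.random.default_rng(2)
    n = 48
    income  = np.cumsum(rng.normal(0.03, 0.02, n))
    price_e = np.cumsum(rng.normal(0.01, 0.03, n))
    price_k = np.cumsum(rng.normal(0.01, 0.03, n))
    demand  = 0.8 * income - 0.6 * price_e + 0.15 * price_k + rng.normal(0, 0.02, n)
    df = pd.DataFrame({"ln_q": demand, "ln_y": income, "ln_pe": price_e, "ln_pk": price_k})

    # Step 1: long-run (cointegrating) regression; slopes are long-run elasticities.
    X = sm.add_constant(df[["ln_y", "ln_pe", "ln_pk"]])
    long_run = sm.OLS(df["ln_q"], X).fit()
    ect = long_run.resid                      # error-correction term

    # Step 2: short-run dynamics in differences with the lagged error-correction term.
    d = df.diff().dropna()
    Xs = sm.add_constant(pd.concat([d[["ln_y", "ln_pe", "ln_pk"]],
                                    ect.shift(1).dropna().rename("ect_1")], axis=1))
    short_run = sm.OLS(d["ln_q"], Xs).fit()
    print(long_run.params)                    # long-run elasticities
    print(short_run.params)                   # short-run elasticities and adjustment speed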

Relevance: 100.00%

Abstract:

This paper investigates the role of social capital in reducing the short- and long-run negative health effects associated with stress, as well as indicators of burnout, among police officers. Despite the large volume of research on either social capital or the health effects of stress, the interaction of these factors remains an underexplored topic. In this empirical analysis we aim to address this shortcoming by focusing on a highly stressful and emotionally draining work environment, namely law enforcement agents, whose work is an essential part of maintaining modern society. Using a multivariate regression analysis based on three different proxies for health and three proxies for social capital, and conducting several robustness checks, we find strong evidence that increased levels of social capital are highly correlated with better health outcomes. Additionally, we observe that while social capital at work is very important, social capital in the home environment and work-life balance are even more important. From a policy perspective, our findings suggest that work and stress programs should actively encourage employees to build stronger social networks as well as incorporate better work/home life arrangements.

Relevance: 100.00%

Abstract:

Evaluation, selection and, finally, decision making are all important issues that engineers face over the long run of projects. Engineers apply mathematical and non-mathematical methods to make accurate and correct decisions whenever needed. As extensive as these methods are, the effect of the selected method on the outputs achieved and the decisions made remains open to question. This is even more controversial where the evaluation is made among non-quantitative alternatives. In civil engineering and construction management problems, criteria include both quantitative and qualitative ones, such as aesthetics, construction duration, building and operation costs, and environmental considerations. As a result, decision making frequently takes place among non-quantitative alternatives. It should be noted that traditional comparison methods, built on clear-cut and inflexible mathematics, have long been criticized. This paper presents a brief review of traditional methods of evaluating alternatives. It also offers a new decision-making method using fuzzy calculations. The main focus of this research is engineering issues that have a flexible nature and vague borders. The suggested method makes the evaluation analyzable for decision makers and is capable of handling multi-criteria, multi-referee problems. In order to ease calculations, a program named DeMA is introduced.
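The abstract does not detail the DeMA program, so the sketch below only illustrates the general idea of fuzzy multi-criteria scoring: triangular fuzzy ratings and weights are aggregated with the usual approximate arithmetic for positive triangular fuzzy numbers and then defuzzified by centroid. All names, scales and numbers are hypothetical and not taken from the paper.

    import numpy as np

    def tfn(l, m, u):
        """Triangular fuzzy number represented as an (l, m, u) triple."""
        return np.array([l, m, u], dtype=float)

    def fuzzy_weighted_sum(ratings, weights):
        """Fuzzy weighted sum of triangular ratings with triangular weights.

        Uses the usual component-wise approximation for adding and
        multiplying positive triangular fuzzy numbers.
        """
        total = np.zeros(3)
        for r, w in zip(ratings, weights):
            total += r * w
        return total

    def defuzzify(t):
        """Centroid of a triangular fuzzy number: (l + m + u) / 3."""
        return float(t.sum() / 3.0)

    # Hypothetical comparison of two design alternatives on three criteria
    # (say cost, duration, aesthetics), rated on a fuzzy 0-10 scale.
    weights = [tfn(0.6, 0.8, 1.0), tfn(0.4, 0.6, 0.8), tfn(0.2, 0.4, 0.6)]
    alt_a   = [tfn(5, 6, 7), tfn(7, 8, 9), tfn(4, 5, 6)]
    alt_b   = [tfn(6, 7, 8), tfn(5, 6, 7), tfn(6, 7, 8)]
    for name, ratings in (("Alternative A", alt_a), ("Alternative B", alt_b)):
        print(name, round(defuzzify(fuzzy_weighted_sum(ratings, weights)), 2))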

Relevance: 100.00%

Abstract:

The economic environment of today can be characterized as highly dynamic and competitive, if not in constant flux. Globalization and the Information Technology (IT) revolution are perhaps the main contributing factors to this observation. While companies have to some extent adapted to the current business environment, new pressures such as the recent increase in environmental awareness and its likely effects on regulation are underway. Hence, in the light of market and competitive pressures, companies must constantly evaluate and, if necessary, update their strategies to sustain and increase the value they create for shareholders (Hunt and Morgan, 1995; Christopher and Towill, 2002). One way to create greater value is to become more efficient in producing and delivering goods and services to customers, which can lead to a strategy known as cost leadership (Porter, 1980). Even though Porter (1996) notes that in the long run cost leadership may not be a sufficient strategy for competitive advantage, operational efficiency is certainly necessary and should therefore be on the agenda of every company.

Better workflow management, technology and resource utilization can lead to greater internal operational efficiency, which explains why, for example, many companies have recently adopted Enterprise Resource Planning (ERP) systems: integrated software packages that streamline business processes. However, as more and more companies approach internal operational excellence, the focus for finding inefficiencies and cost-saving opportunities is moving beyond the boundaries of the firm. Today many firms in the supply chain are engaging in collaborative relationships with customers, suppliers and third parties (services) in an attempt to cut down on costs related to, for example, inventory and production, as well as to facilitate synergies. Thus, recent years have witnessed fluidity and blurring of organizational boundaries (Coad and Cullen, 2006).

The Information Technology (IT) revolution of the late 1990s has played an important role in bringing organizations closer together. In their efforts to become more efficient, companies first integrated their information systems to speed up transactions such as ordering and billing. Later, collaboration on a multidimensional scale including logistics, production, and Research & Development became evident as companies expected substantial benefits from collaboration. However, one could also argue that the recent popularity of the concepts falling under Supply Chain Management (SCM), such as Vendor Managed Inventory, Collaborative Planning, Replenishment, and Forecasting, owes something to the marketing efforts of software vendors and consultants who provide these solutions. Nevertheless, reports from professional organizations as well as academia indicate that the trend towards interorganizational collaboration is gaining wider ground. For example, the ARC Advisory Group, a research organization on supply chain solutions, estimated that the market for SCM, which includes various kinds of collaboration tools and related services, would grow at an annual rate of 7.4% during the years 2004-2008, reaching $7.4 billion in 2008 (Engineeringtalk 2004).

Relevance: 100.00%

Abstract:

The uniformization method (also known as randomization) is a numerically stable algorithm for computing transient distributions of a continuous time Markov chain. When the solution is needed after a long run or when the convergence is slow, the uniformization method involves a large number of matrix-vector products. Despite this, the method remains very popular due to its ease of implementation and its reliability in many practical circumstances. Because calculating the matrix-vector product is the most time-consuming part of the method, overall efficiency in solving large-scale problems can be significantly enhanced if the matrix-vector product is made more economical. In this paper, we incorporate a new relaxation strategy into the uniformization method to compute the matrix-vector products only approximately. We analyze the error introduced by these inexact matrix-vector products and discuss strategies for refining the accuracy of the relaxation while reducing the execution cost. Numerical experiments drawn from computer systems and biological systems are given to show that significant computational savings are achieved in practical applications.
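For orientation, a minimal version of the basic uniformization scheme (with exact matrix-vector products, i.e. without the paper's relaxation strategy) might look as follows. The generator in the example is a placeholder.

    import numpy as np
    from scipy.stats import poisson

    def uniformization(Q, pi0, t, tol=1e-10):
        """Transient distribution pi(t) of a CTMC with generator Q by uniformization.

        Q   : (n, n) generator matrix (rows sum to zero)
        pi0 : initial distribution (row vector)
        Basic scheme with exact matrix-vector products; the paper's contribution
        is to relax the accuracy of these products.
        """
        Lam = max(-np.diag(Q).min(), 1e-12)            # uniformization rate
        P = np.eye(Q.shape[0]) + Q / Lam               # uniformized DTMC matrix
        K = int(poisson.ppf(1.0 - tol, Lam * t)) + 1   # Poisson truncation point
        weights = poisson.pmf(np.arange(K + 1), Lam * t)
        v = np.asarray(pi0, dtype=float)
        out = weights[0] * v
        for k in range(1, K + 1):
            v = v @ P                                  # the costly vector-matrix product
            out += weights[k] * v
        return out / out.sum()                         # renormalize the truncated sum

    # Example: two-state birth-death chain.
    Q = np.array([[-2.0, 2.0],
                  [ 3.0, -3.0]])
    print(uniformization(Q, [1.0, 0.0], t=0.5))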

Relevance: 100.00%

Abstract:

Orthopaedic fracture fixation implants are increasingly being designed using accurate 3D models of long bones based on computed tomography (CT). Unlike CT, magnetic resonance imaging (MRI) does not involve ionising radiation and is therefore a desirable alternative to CT. This study aims to quantify the accuracy of MRI-based 3D models compared to CT-based 3D models of long bones. The femora of five intact cadaver ovine limbs were scanned using a 1.5T MRI scanner and a CT scanner. Image segmentation of the CT and MRI data was performed using a multi-threshold segmentation method. Reference models were generated by digitising the bone surfaces, free of soft tissue, with a mechanical contact scanner. The MRI- and CT-derived models were validated against the reference models. The results demonstrated that the CT-based models contained an average error of 0.15 mm while the MRI-based models contained an average error of 0.23 mm. Statistical validation shows that there are no significant differences between 3D models based on CT and MRI data. These results indicate that the geometric accuracy of MRI-based 3D models is comparable to that of CT-based models and that MRI is therefore a potential alternative to CT for the generation of 3D models with high geometric accuracy.
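A simplified sketch of the kind of pipeline described, region-wise intensity thresholding followed by surface extraction, is given below using scikit-image. The function, thresholds and region boundaries are illustrative and do not reproduce the study's threshold selection method or software.

    import numpy as np
    from skimage import measure

    def segment_and_mesh(volume, thresholds, z_breaks):
        """Multi-threshold segmentation of a bone volume followed by surface extraction.

        volume     : 3D image array with the bone axis along axis 0
        thresholds : one intensity threshold per longitudinal region
        z_breaks   : slice indices separating the regions (e.g. proximal / shaft / distal)
        Illustrative only; thresholds would normally come from a selection method.
        """
        mask = np.zeros_like(volume, dtype=bool)
        bounds = [0, *z_breaks, volume.shape[0]]
        for thr, lo, hi in zip(thresholds, bounds[:-1], bounds[1:]):
            mask[lo:hi] = volume[lo:hi] >= thr        # region-specific threshold
        # Extract the bone surface as a triangle mesh from the binary mask.
        verts, faces, normals, values = measure.marching_cubes(mask.astype(float), level=0.5)
        return mask, verts, faces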

Relevance: 100.00%

Abstract:

The design of pre-contoured fracture fixation implants (plates and nails) that correctly fit the anatomy of a patient utilises 3D models of long bones with accurate geometric representation. 3D data are usually available from computed tomography (CT) scans of human cadavers, which generally represent the over-60 age group. Thus, despite the fact that half of the seriously injured population comes from the 30-year age group and below, virtually no data exist from these younger age groups to inform the design of implants that optimally fit such patients. Hence, relevant bone data from these age groups are required. The current gold standard for acquiring such data, CT, involves ionising radiation and cannot be used to scan healthy human volunteers. Magnetic resonance imaging (MRI) has been shown to be a potential alternative in previous studies conducted on small bones (tarsal bones) and parts of long bones. However, in order to use MRI effectively for 3D reconstruction of human long bones, further validation using long bones and appropriate reference standards is required. Accurate reconstruction of 3D models from CT or MRI data sets requires an accurate image segmentation method. Currently available sophisticated segmentation methods involve complex programming and mathematics that researchers are typically not trained to perform; an accurate but relatively simple segmentation method is therefore required for CT and MRI data. Furthermore, some of the limitations of 1.5T MRI, such as very long scanning times and poor contrast in articular regions, can potentially be reduced by using higher-field 3T MRI; however, the signal-to-noise ratio (SNR) gain at the bone-soft tissue interface needs to be quantified, and this is not reported in the literature. Finally, because MRI scanning of long bones involves very long scanning times, the acquired images are prone to motion artefacts caused by random movements of the subject's limbs. One of the artefacts observed is the step artefact, believed to result from random movements of the volunteer during a scan, which needs to be corrected before the models can be used for implant design.

The study therefore had four aims: first, to investigate two segmentation methods, intensity thresholding and Canny edge detection, as accurate but simple methods for segmentation of MRI and CT data; second, to investigate the usability of MRI as a radiation-free imaging alternative to CT for reconstruction of 3D models of long bones; third, to use 3T MRI to address the poor articular contrast and long scanning times of current MRI; and fourth, to minimise the step artefact using 3D modelling techniques.

The segmentation methods were investigated using CT scans of five ovine femora. Single-level thresholding was performed using a visually selected threshold level to segment the complete femur. For multilevel thresholding, multiple threshold levels calculated from the threshold selection method were used for the proximal, diaphyseal and distal regions of the femur. Canny edge detection was applied by delineating the outer and inner contours of the 2D images and then combining them to generate the 3D model. Models generated by these methods were compared to the reference standard generated from mechanical contact scans of the denuded bone. The second aim was addressed using CT and MRI scans of five ovine femora segmented with the multilevel threshold method; a surface geometric comparison was conducted between the CT-based, MRI-based and reference models. To quantitatively compare 1.5T and 3T MRI, the right lower limbs of five healthy volunteers were scanned using scanners from the same manufacturer, and the images obtained with identical protocols were compared by means of SNR and contrast-to-noise ratio (CNR) of muscle, bone marrow and bone. In order to correct the step artefact in the final 3D models, the step was simulated in five ovine femora scanned with a 3T MRI scanner and corrected using an iterative closest point (ICP) based alignment method.

The present study demonstrated that the multi-threshold approach in combination with the threshold selection method can generate 3D models of long bones with an average deviation of 0.18 mm; the corresponding figure for the single-threshold method was 0.24 mm, and the difference between the two methods was statistically significant. In comparison, the Canny edge detection method generated an average deviation of 0.20 mm. MRI-based models exhibited an average deviation of 0.23 mm compared with 0.18 mm for the CT-based models; these differences were not statistically significant. 3T MRI improved the contrast at the bone-muscle interfaces of most anatomical regions of the femora and tibiae, potentially reducing the inaccuracies caused by poor contrast in the articular regions. Using the robust ICP algorithm to align the 3D surfaces, the step artefact caused by the volunteer moving the leg was corrected, giving errors of 0.32 ± 0.02 mm when compared with the reference standard. The study concludes that magnetic resonance imaging, together with simple multilevel thresholding segmentation, is able to produce 3D models of long bones with accurate geometric representations. The method is therefore a potential alternative to the current gold standard, CT imaging.
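As an illustration of the alignment step, a minimal point-to-point ICP with an SVD (Kabsch) rigid-transform update is sketched below. It is a generic implementation under simplifying assumptions, not the thesis' exact step-correction procedure, and the function name and defaults are mine.

    import numpy as np
    from scipy.spatial import cKDTree

    def icp(source, target, n_iter=50, tol=1e-7):
        """Minimal point-to-point ICP aligning `source` onto `target` (N x 3 arrays)."""
        src = source.copy()
        tree = cKDTree(target)
        prev_err = np.inf
        R_total, t_total = np.eye(3), np.zeros(3)
        for _ in range(n_iter):
            dist, idx = tree.query(src)                 # closest-point correspondences
            matched = target[idx]
            # Best rigid transform (Kabsch / SVD) for the current correspondences.
            mu_s, mu_t = src.mean(axis=0), matched.mean(axis=0)
            H = (src - mu_s).T @ (matched - mu_t)
            U, _, Vt = np.linalg.svd(H)
            D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
            R = Vt.T @ D @ U.T                          # reflection-safe rotation
            t = mu_t - R @ mu_s
            src = src @ R.T + t                         # apply the update
            R_total, t_total = R @ R_total, R @ t_total + t
            err = dist.mean()
            if abs(prev_err - err) < tol:
                break
            prev_err = err
        return R_total, t_total, err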