978 results for Macro-econometric model


Relevance: 30.00%

Publisher:

Abstract:

We estimate the 'fundamental' component of euro area sovereign bond yield spreads, i.e. the part of bond spreads that can be justified by country-specific economic factors, euro area economic fundamentals, and international influences. The yield spread decomposition is achieved using a multi-market, no-arbitrage affine term structure model with a unique pricing kernel. More specifically, we use the canonical representation proposed by Joslin, Singleton, and Zhu (2011) and introduce, alongside the standard spanned factors, a set of unspanned macro factors, as in Joslin, Priebsch, and Singleton (2013). The model is applied to yield curve data from Belgium, France, Germany, Italy, and Spain over the period 2005-2013. Overall, our results show that economic fundamentals are the dominant drivers behind sovereign bond spreads. Nevertheless, shocks unrelated to the fundamental component of the spread have played an important role in the dynamics of bond spreads since the intensification of the sovereign debt crisis in the summer of 2011.
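The full no-arbitrage affine term structure model is beyond the scope of an abstract, but the basic decomposition idea (a fitted 'fundamental' component plus an orthogonal residual) can be sketched with a simple OLS projection. The factor series and coefficients below are synthetic illustrations, not the paper's estimates:

```python
import numpy as np

def decompose_spread(spread, fundamentals):
    """Split an observed yield spread into a fitted 'fundamental' part
    and a residual (non-fundamental) part via an OLS projection.

    spread       : (T,) array of observed sovereign yield spreads
    fundamentals : (T, k) array of country / area-wide / global factors
    """
    T = spread.shape[0]
    X = np.column_stack([np.ones(T), fundamentals])   # add an intercept
    beta, *_ = np.linalg.lstsq(X, spread, rcond=None)
    fundamental_part = X @ beta
    residual_part = spread - fundamental_part          # non-fundamental shocks
    return fundamental_part, residual_part

# Toy example: a spread driven by one macro factor plus small noise
rng = np.random.default_rng(0)
factor = rng.normal(size=200)
spread = 0.5 + 2.0 * factor + rng.normal(scale=0.1, size=200)
fund, resid = decompose_spread(spread, factor.reshape(-1, 1))
```

By construction the two parts sum back to the observed spread, which is the sense in which the residual captures movements "unrelated to the fundamental component".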

Relevance: 30.00%

Publisher:

Abstract:

National Highway Traffic Safety Administration, Office of Research and Development, Washington, D.C.


Relevance: 30.00%

Publisher:

Abstract:

Because field determinations take much effort, it would be useful to be able to predict easily the coefficients describing the functional response of free-living predators, i.e. the function relating food intake rate to the abundance of food organisms in the environment. As a means of easily parameterising an individual-based model of shorebird Charadriiformes populations, we attempted this for shorebirds eating macro-invertebrates. Intake rate is measured as the ash-free dry mass (AFDM) consumed per second of active foraging, i.e. excluding time spent on digestive pauses and other activities, such as preening. The present and previous studies show that the general shape of the functional response in shorebirds eating approximately the same size of prey across the full range of prey density is a decelerating rise to a plateau, thus approximating the Holling type II ('disc equation') formulation. But field studies confirmed that the asymptote was not set by handling time, as assumed by the disc equation, because only about half the foraging time was spent in successfully or unsuccessfully attacking and handling prey, the rest being devoted to searching. A review of 30 functional responses showed that intake rate in free-living shorebirds varied independently of prey density over a wide range, with the asymptote being reached at very low prey densities (< 150 m⁻²). Accordingly, most of the many studies of shorebird intake rate have probably been conducted at or near the asymptote of the functional response, suggesting that equations that predict intake rate should also predict the asymptote. A multivariate analysis of 468 'spot' estimates of intake rate from 26 shorebirds identified ten variables, representing prey and shorebird characteristics, that accounted for 81% of the variance in logarithm-transformed intake rate.
But four variables accounted for almost as much (77.3%): bird size, prey size, whether the bird was an oystercatcher Haematopus ostralegus eating mussels Mytilus edulis, and whether it was breeding. The four-variable equation under-predicted the 30 observed estimates of the asymptote by 11.6% on average, but this discrepancy was reduced to 0.2% when two suspect estimates from one early study in the 1960s were removed. The equation therefore predicted the observed asymptote very successfully in 93% of cases. We conclude that the asymptote can be reliably predicted from just four easily measured variables. Indeed, if the birds are not breeding and are not oystercatchers eating mussels, reliable predictions can be obtained using just two variables, bird and prey sizes. A multivariate analysis of 23 estimates of the half-asymptote constant suggested they were smaller when prey were small but greater when the birds were large, especially in oystercatchers. The resulting equation could be used to predict the half-asymptote constant, but its predictive power has yet to be tested. As well as predicting the asymptote of the functional response, the equations will enable research workers in many areas of shorebird ecology and behaviour to estimate intake rate without the need for conventional time-consuming field studies, including for species in which it has not yet proved possible to measure intake rate in the field.
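For reference, the Holling type II ('disc equation') shape discussed above can be written as a one-line function. The attack rate and handling time below are arbitrary illustrative values; as the study notes, in the field the observed asymptote need not equal 1/handling time:

```python
import numpy as np

def holling_type2(density, attack_rate, handling_time):
    """Holling type II ('disc equation') functional response:
    intake rate rises with prey density and saturates at 1/handling_time."""
    return attack_rate * density / (1.0 + attack_rate * handling_time * density)

# Decelerating rise to a plateau at 1/handling_time = 0.5
density = np.linspace(0.0, 1000.0, 5)
rates = holling_type2(density, attack_rate=0.05, handling_time=2.0)
```

The review's finding that the asymptote is reached at very low prey densities corresponds to the curve flattening early relative to the densities typically observed.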

Relevance: 30.00%

Publisher:

Abstract:

This paper examines the source-country determinants of FDI into Japan. It highlights certain methodological and theoretical weaknesses in the previous literature and offers some explanations for hitherto ambiguous results. Specifically, the paper highlights the importance of panel data analysis and of identifying fixed effects rather than simply pooling the data; indeed, we argue that many of the results reported elsewhere are a feature of this mis-specification. To this end, pooled, fixed effects and random effects estimates are compared. The results suggest that FDI into Japan is inversely related to trade flows, such that trade and FDI are substitutes, and that FDI increases with home-country political and economic stability. The paper also shows that previously reported results regarding the importance of exchange rates, relative borrowing costs and labour costs in explaining FDI flows are sensitive to the econometric specification and estimation approach. These results matter within a policy context: in recent years Japan has sought to attract FDI, though many firms still complain of barriers to inward investment penetration. The results show that cultural and geographic distance are only of marginal importance in explaining FDI, and that the results are consistent with the market-seeking explanation of FDI. As such, the attitude to risk in the source country is strongly related to the size of FDI flows to Japan.
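The pooled-versus-fixed-effects point can be illustrated with a toy panel in which country intercepts are correlated with the regressor, so pooled OLS is biased while the within (fixed effects) estimator recovers the true slope. The data-generating numbers are invented for illustration and are not the paper's data:

```python
import numpy as np

def pooled_ols(y, x):
    """Pooled OLS slope, ignoring the panel structure."""
    X = np.column_stack([np.ones_like(x), x])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return b[1]

def fixed_effects(y, x, group):
    """Within estimator: demean y and x by group before OLS,
    absorbing group-specific (e.g. country) intercepts."""
    yd, xd = y.astype(float).copy(), x.astype(float).copy()
    for g in np.unique(group):
        m = group == g
        yd[m] -= yd[m].mean()
        xd[m] -= xd[m].mean()
    return (xd @ yd) / (xd @ xd)

# Toy panel: 10 'countries', intercepts correlated with the regressor
rng = np.random.default_rng(1)
group = np.repeat(np.arange(10), 20)
alpha = group * 1.0                                # country effect
x = alpha + rng.normal(size=200)                   # correlated with alpha
y = 3.0 * alpha + 2.0 * x + rng.normal(scale=0.1, size=200)
b_pooled = pooled_ols(y, x)                        # biased away from 2
b_fe = fixed_effects(y, x, group)                  # close to the true 2
```

This is the mechanism behind the paper's claim: pooling attributes the country effects to the regressor, while identifying the fixed effects removes that contamination.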

Relevance: 30.00%

Publisher:

Abstract:

This paper presents a simple three-sector model that seeks to establish links between structural change in employment and the level of economic development. The model is a modified version of that of Rowthorn-Wells (1987). The theoretical analysis is supplemented with simple econometric tests which illustrate how the modified Rowthorn-Wells model can be used (i) to motivate empirical estimates of the link between the level of development and structures of employment, and (ii) to illustrate structural distortions under command economies.

Relevance: 30.00%

Publisher:

Abstract:

In this paper we present a simple three-sector model explaining structural change in employment, a modified version of Rowthorn-Wells (1987). We supplement the theoretical analysis with simple econometric tests which illustrate how the modified Rowthorn-Wells model can be used to (i) motivate empirical estimates of the link between the level of development and structures of employment, and (ii) illustrate structural distortions under command economies and the structural adjustment that occurred during the post-Communist transition. We demonstrate that, for these economies, the transition process leads to an adjustment towards the employment structures predicted by the model.
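The stylized three-sector pattern behind such models (agriculture declining, industry hump-shaped, services rising with development) can be sketched numerically. The functional forms below are illustrative assumptions only, not the Rowthorn-Wells specification:

```python
import numpy as np

def employment_shares(log_income):
    """Stylized employment shares (agriculture, industry, services) as
    functions of the development level. Purely illustrative forms:
    agriculture decays, industry is hump-shaped, services grow."""
    a = np.exp(-1.0 * log_income)                  # agriculture falls
    i = np.exp(-0.5 * (log_income - 1.0) ** 2)     # industry hump
    s = np.exp(0.5 * log_income)                   # services rise
    total = a + i + s                              # normalise to shares
    return a / total, i / total, s / total

levels = np.linspace(0.0, 4.0, 100)                # development index
agri, ind, serv = employment_shares(levels)
```

A command-economy "structural distortion" in this framework would show up as an observed industry share well above the curve implied by the country's development level.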

Relevance: 30.00%

Publisher:

Abstract:

Purpose: To investigate the coexistence of ocular microvascular and systemic macrovascular abnormalities in early stage, newly diagnosed and previously untreated normal tension glaucoma patients (NTG). Methods: Retinal vascular reactivity to flickering light was assessed in 19 NTG and 28 age-matched controls by means of dynamic retinal vessel analysis (IMEDOS GmbH, Jena, Germany). Using a newly developed computational model, the entire dynamic vascular response profile to flicker light was imaged and used for analysis. In addition, assessments of carotid intima-media thickness (IMT) and pulse wave analysis (PWA) were conducted on all participants, along with blood pressure (BP) measurements and blood analyses for lipid metabolism markers. Results: Patients with NTG demonstrated an increased right and left carotid IMT (p = 0.015, p = 0.045) and an elevated PWA augmentation index (p = 0.017) in comparison with healthy controls, along with an enhanced retinal arterial constriction response (p = 0.028), a steeper retinal arterial constriction slope (p = 0.031) and a reduced retinal venous dilation response (p = 0.026) following flicker light stimulation. Conclusions: Early stage, newly diagnosed, NTG patients showed signs of subclinical vascular abnormalities at both macro- and micro-vascular levels, highlighting the need to consider multi-level circulation-related pathologies in the development and progression of this type of glaucoma.

Relevance: 30.00%

Publisher:

Abstract:

This study examines the congruency of planning between organizational structure and process through an evaluation and planning model known as the Micro/Macro Dynamic Planning Grid. The model compares day-to-day planning within an organization to planning imposed by organizational administration and accrediting agencies. A survey instrument was developed to assess the micro- and macro-sociological analysis elements utilized by an organization.

The Micro/Macro Dynamic Planning Grid consists of four quadrants, each containing characteristics that reflect the interaction between the micro and macro elements of planning, objectives and goals within an organization. Quadrant 1, Over Macro/Over Micro, contains attributes that reflect a tremendous amount of action and ongoing adjustment, typical of an organization undergoing significant changes in leadership, program and/or structure. Quadrant 2, Over Macro/Under Micro, reflects planning characteristics found in large, bureaucratic systems with little regard given to the workings of their component parts. Quadrant 3, Under Macro/Over Micro, reflects the uncooperative, uncoordinated organization, one that contains a multiplicity of viewpoints, language, objectives and goals. Quadrant 4, Under Macro/Under Micro, represents the worst-case scenario for any organization; its attributes are reactive, chaotic, non-productive and redundant.

There were three phases to the study: development of the initial instrument; pilot testing and item revision; and administration and assessment of the refined instrument.
The survey instrument was found to be valid and reliable for the purposes and audiences described. To expand the applicability of the instrument to other organizational settings, the survey was administered to three professional colleges within a university. The first three specific research questions collectively answered, in the affirmative, the basic research question: can the Micro/Macro Dynamic Planning Grid be applied to an organization through an organizational development tool? The specific research questions were: (1) Can an instrument be constructed that applies the Micro/Macro Dynamic Planning Grid? (2) Is the constructed instrument valid and reliable? (3) Does an instrument that applies the Micro/Macro Dynamic Planning Grid assess congruency of micro and macro planning, goals and objectives within an organization? (4) What are the differences in responses based on roles and responsibilities within an organization? The fourth question involved statistical analysis of the response data and comparisons with the demographic data. (Abstract shortened by UMI.)
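A minimal sketch of how survey scores might map onto the grid's four quadrants, assuming a simple midpoint threshold; the threshold and scoring rule are illustrative assumptions, not the published instrument's method:

```python
def planning_quadrant(macro_score, micro_score, threshold=3.0):
    """Map a pair of survey scores to a Micro/Macro Dynamic Planning Grid
    quadrant label. Scores above `threshold` count as 'Over'; the
    threshold value is a hypothetical choice for illustration."""
    macro = "Over" if macro_score > threshold else "Under"
    micro = "Over" if micro_score > threshold else "Under"
    return f"{macro} Macro/{micro} Micro"

# e.g. a bureaucratic profile: strong macro planning, weak day-to-day planning
label = planning_quadrant(4.2, 1.8)   # "Over Macro/Under Micro"
```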

Relevance: 30.00%

Publisher:

Abstract:

The rate of fatal crashes in Florida has remained significantly higher than the national average for the last several years. The 2003 statistics from the National Highway Traffic Safety Administration (NHTSA), the latest available, show a fatality rate in Florida of 1.71 per 100 million vehicle-miles traveled, compared with the national average of 1.48. The objective of this research is to better understand the driver, environmental, and roadway factors that affect the probability of injury severity in Florida.

In this research, the ordered logit model was used to develop six injury severity models: single-vehicle and two-vehicle crashes on urban freeways and on urban principal arterials, and two-vehicle crashes at urban signalized and unsignalized intersections. The data comprised all crashes that occurred on the state highway system from 2001 to 2003 in the Southeast Florida region, which includes Miami-Dade, Broward and Palm Beach Counties.

The results indicate that the age group and gender of the driver at fault were significant factors in injury severity risk across all models. The greatest risk of severe injury was observed for the age groups 55 to 65 and 66 and older. A positive association between injury severity and the race of the driver at fault was also found: an at-fault driver of Hispanic origin was associated with a higher risk of severe injury in both freeway models and in the two-vehicle crash model on arterial roads, and a higher risk of more severe injury was found when an African-American was the at-fault driver in two-vehicle crashes on freeways. In addition, arterial class was positively associated with the risk of severe crashes: six-lane divided arterials exhibited the highest injury severity risk of all arterial classes, while the lowest risk was found for one-way roads.
Alcohol involvement by the driver at fault was also a significant risk factor for severe injury in the single-vehicle crash model on freeways.
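The ordered logit model used above maps a linear index of crash factors into probabilities over ordered severity categories via increasing cutpoints. A minimal sketch of that probability calculation, with hypothetical cutpoint values rather than the study's estimates:

```python
import numpy as np

def ordered_logit_probs(xb, cutpoints):
    """Category probabilities in an ordered logit model.

    xb        : linear index beta'x for one observation
    cutpoints : increasing thresholds c_1 < ... < c_{K-1}
    Returns a length-K vector of probabilities over the ordered categories.
    """
    def logistic(z):
        return 1.0 / (1.0 + np.exp(-z))
    # CDF evaluated at each cutpoint, padded with 0 and 1 at the extremes
    cdf = np.concatenate([[0.0], logistic(np.asarray(cutpoints) - xb), [1.0]])
    return np.diff(cdf)

# Three severity levels (e.g. no injury / injury / fatal), hypothetical cutpoints
p = ordered_logit_probs(xb=0.5, cutpoints=[-1.0, 1.0])
```

A larger index value (e.g. an older at-fault driver, in the study's terms) shifts probability mass toward the more severe categories, which is how the model expresses "higher risk of severe injury".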

Relevance: 30.00%

Publisher:

Abstract:

Exchange rate economics has achieved substantial development in the past few decades. Despite extensive research, a large number of unresolved problems remain in the exchange rate debate. This dissertation studied three puzzling issues aiming to improve our understanding of exchange rate behavior. Chapter Two used advanced econometric techniques to model and forecast exchange rate dynamics. Chapters Three and Four studied issues related to exchange rates using the theory of New Open Economy Macroeconomics.

Chapter Two empirically examined the short-run forecastability of nominal exchange rates. It analyzed important empirical regularities in daily exchange rates. Through a series of hypothesis tests, a best-fitting fractionally integrated GARCH model with a skewed Student-t error distribution was identified. The forecasting performance of the model was compared with that of a random walk model. Results supported the contention that nominal exchange rates seem to be unpredictable over the short run, in the sense that the best-fitting model cannot beat the random walk model in forecasting exchange rate movements.

Chapter Three assessed the ability of dynamic general-equilibrium sticky-price monetary models to generate volatile foreign exchange risk premia. It developed a tractable two-country model where agents face a cash-in-advance constraint and set prices to the local market, and the exogenous money supply process exhibits time-varying volatility. The model yielded approximate closed-form solutions for risk premia and real exchange rates. Numerical results provided quantitative evidence that volatile risk premia can endogenously arise in a new open economy macroeconomic model; thus, the model has the potential to rationalize the Uncovered Interest Parity puzzle.
Chapter Four sought to resolve the consumption-real exchange rate anomaly, which refers to the inability of most international macro models to generate the negative cross-correlations between real exchange rates and relative consumption across two countries observed in the data. While maintaining the assumption of complete asset markets, this chapter introduced endogenously segmented asset markets into a dynamic sticky-price monetary model. Simulation results showed that such a model can replicate the stylized fact that real exchange rates tend to move in the opposite direction to relative consumption.
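Chapter Two's forecasting comparison hinges on the random walk benchmark: a candidate model "beats" it only if its out-of-sample error is lower. A minimal sketch of computing the benchmark's one-step-ahead RMSE on simulated data (the series and its scale are synthetic, not the chapter's dataset):

```python
import numpy as np

def rmse(forecast, actual):
    """Root mean squared forecast error."""
    return float(np.sqrt(np.mean((forecast - actual) ** 2)))

def random_walk_forecast(rates):
    """One-step-ahead random walk forecast: tomorrow's rate = today's."""
    return rates[:-1]

# Simulated daily log exchange rate following a driftless random walk
rng = np.random.default_rng(2)
rates = np.cumsum(rng.normal(scale=0.005, size=1000))
rw_err = rmse(random_walk_forecast(rates), rates[1:])
```

Any fitted model (such as the chapter's fractionally integrated GARCH specification) would be evaluated by whether its forecast RMSE falls below `rw_err` out of sample.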


Relevance: 30.00%

Publisher:

Abstract:

Most research on stock prices is based on the present value model or the more general consumption-based model. When applied to real economic data, both are found unable to account for either the level of stock prices or their volatility. The three essays here attempt both to build a more realistic model and to check whether there is still room for bubbles in explaining fluctuations in stock prices.

In the second chapter, several innovations are simultaneously incorporated into the traditional present value model in order to produce more accurate model-based fundamental prices. These innovations comprise replacing the narrower traditional dividend measure with broad dividends, a nonlinear artificial neural network (ANN) forecasting procedure for these broad dividends in place of the more common linear forecasting models, and a stochastic discount rate in place of a constant discount rate. Empirical results show that this model predicts fundamental prices better than alternative models using a linear forecasting process, narrow dividends, or a constant discount factor. Nonetheless, actual prices remain largely detached from fundamental prices, and the bubble-like deviations are found to coincide with business cycles.

The third chapter examines possible cointegration of stock prices with fundamentals and non-fundamentals. The output gap is introduced to form the non-fundamental part of stock prices. I use a trivariate vector autoregression (TVAR) model and a single-equation model to run cointegration tests between these three variables. Neither cointegration test shows strong evidence of explosive behavior in the DJIA and S&P 500 data. I then apply a sup augmented Dickey-Fuller (SADF) test to check for the existence of periodically collapsing bubbles in stock prices; such bubbles are found in the S&P 500 data during the late 1990s.
Employing the econometric tests from the third chapter, the fourth chapter examines whether bubbles exist in the stock prices of conventional economic sectors on the New York Stock Exchange. The 'old economy' as a whole is not found to have bubbles, but periodically collapsing bubbles are found in the Materials and Telecommunication Services sectors and in the Real Estate industry group.
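The SADF idea (a supremum of ADF statistics over forward-expanding windows, with large positive values indicating explosive, bubble-like behavior) can be sketched as follows. This minimal version omits the lag augmentation and critical-value simulation used in practice, and the series are synthetic:

```python
import numpy as np

def adf_tstat(y):
    """t-statistic on rho in: diff(y)_t = a + rho * y_{t-1} + e_t
    (a minimal ADF regression with no lagged differences)."""
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - 2)            # residual variance
    cov = s2 * np.linalg.inv(X.T @ X)             # coefficient covariance
    return beta[1] / np.sqrt(cov[1, 1])

def sadf(y, min_window=30):
    """Sup ADF statistic over forward-expanding windows starting at t=0;
    large positive values signal explosive behavior."""
    return max(adf_tstat(y[:end]) for end in range(min_window, len(y) + 1))

rng = np.random.default_rng(3)
walk = np.cumsum(rng.normal(size=200))                          # unit root
bubble = np.cumprod(np.full(200, 1.05)) + rng.normal(scale=0.1, size=200)
```

On the explosive series the sup statistic is far above anything the random walk produces, which is the signal the third and fourth chapters exploit to date periodically collapsing bubbles.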