923 results for Exponential financial models
Abstract:
This paper considers the role of social model features in the economic performance of Italy and Spain during the run-up to the Eurozone crisis, as well as the consequences of that crisis, in turn, for the two countries' social models. It takes issue with the prevailing view - what I refer to as the “competitiveness thesis” - which attributes the debtor status of the two countries to a lack of competitive capacity rooted in social model features. This competitiveness thesis has been key in justifying the “liberalization plus austerity” measures that European institutions have demanded in return for financial support for Italy and Spain at critical points during the crisis. The paper challenges this prevailing wisdom. First, it reviews the characteristics of the Italian and Spanish social models and their evolution in the period prior to the crisis, revealing a far more complex, dynamic and differentiated picture than is given in the political economy literature. Second, the paper considers various ways in which social model characteristics are said to have contributed to the Eurozone crisis, finding such explanations wanting. Italy and Spain's debtor status was primarily the result of much broader dynamics in the Eurozone, including capital flows from richer to poorer countries that affected economic demand, with social model features playing, at most, an ancillary role. More aggressive reforms responding to EU demands in Spain may have increased the long-term social and economic costs of the crisis, whereas the political stalemate that slowed such reforms in Italy may have paradoxically mitigated these costs. The comparison of the two countries thus suggests that, in the absence of broader macro-institutional reform of the Eurozone, compliance with EU dictates may have had perverse effects.
Abstract:
This paper reviews peer-to-peer (P2P) lending, its development in the UK and other countries, and assesses the business and economic policy issues surrounding this new form of intermediation. P2P platform technology allows direct matching of borrowers and lenders, with diversification over a large number of borrowers, without the loans having to be held on an intermediary balance sheet. P2P lending has developed rapidly in both the US and the UK, but it still represents a small fraction, less than 1%, of the stock of bank lending. In the UK – but not elsewhere – it is an important source of loans for smaller companies. We argue that P2P lending is fundamentally complementary to, and not competitive with, conventional banking. We therefore expect banks to adapt to the emergence of P2P lending, either by cooperating closely with third-party P2P lending platforms or by offering their own proprietary platforms. We also argue that the full development of the sector requires much further work addressing the risks and business and regulatory issues in P2P lending, including risk communication, orderly resolution of platform failure, control of liquidity risks and minimisation of fraud, security and operational risks. This will depend on developing reliable business processes, promoting transparency and standardisation to the fullest extent possible, and putting in place appropriate regulation that serves the needs of customers.
Abstract:
NPT and NVT Monte Carlo simulations are applied to models for methane and water to predict the PVT behaviour of these fluids over a wide range of temperatures and pressures. The potential models examined in this paper have previously been presented in the literature with their specific parameters optimised to fit phase coexistence data. The exponential-6 potential for methane gives generally good prediction of PVT behaviour over the full range of temperatures and pressures studied, with the only significant deviation from experimental data seen at high temperatures and pressures. The NSPCE water model shows very poor prediction of PVT behaviour, particularly at dense conditions. To improve this, the charge separation in the NSPCE model is varied with density. Improvements for vapour and liquid phase PVT predictions are achieved with this variation. No improvement was found in the prediction of the oxygen-oxygen radial distribution by varying charge separation under dense phase conditions.
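For reference, the exponential-6 (modified Buckingham) pair potential mentioned above has the generic form U(r) = eps/(1 - 6/alpha) [(6/alpha) exp(alpha(1 - r/rm)) - (rm/r)^6]. The sketch below evaluates it with rough, methane-like placeholder parameters, not the fitted values used in the paper.

```python
import numpy as np

# Exponential-6 (modified Buckingham) pair potential:
#   U(r) = eps / (1 - 6/alpha) * [ (6/alpha) * exp(alpha * (1 - r/rm)) - (rm/r)**6 ]
# Parameter values below are illustrative placeholders, not the paper's fitted values.
def exp6_potential(r, eps, rm, alpha):
    """Exp-6 potential energy at separation r (same units as rm); eps sets the well depth."""
    prefac = eps / (1.0 - 6.0 / alpha)
    return prefac * ((6.0 / alpha) * np.exp(alpha * (1.0 - r / rm)) - (rm / r) ** 6)

# Evaluate on a grid of separations (Angstrom) with rough methane-like parameters.
r = np.linspace(3.0, 10.0, 8)
u = exp6_potential(r, eps=160.3, rm=4.19, alpha=15.0)   # eps given as epsilon/k_B in Kelvin
for ri, ui in zip(r, u):
    print(f"r = {ri:5.2f} A   U/k_B = {ui:8.2f} K")
```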
Abstract:
In Queensland, Australia, there is presently a high level of interest in long-rotation hardwood plantation investments for sawlog production, despite the consensus in Australian literature that such investments are not financially viable. Continuing genetics, silviculture and processing research, and increasing awareness about the ecosystem services generated by plantations, are anticipated to make future plantings profitable and socio-economically desirable in many parts of Queensland. Financial and economic models of hardwood plantations in Queensland are developed to test this hypothesis. The economic model accounts for carbon sequestration, salinity amelioration and other ecosystem service values of hardwood plantations. A carbon model estimates the value of carbon sequestered, while salinity and other ecosystem service values are estimated by the benefit transfer method. Where high growth rates (20-25 m(3) ha(-1) year(-1)) are achievable, long-rotation hardwood plantations are profitable in Queensland Hardwood Regions 1, 3 and 7 when rural land values are less than $2300/ha. Under optimistic assumptions, hardwood plantations growing at a rate of 15 m(3) ha(-1) year(-1) are financially viable in Hardwood Regions 2, 4 and 8, provided land values are less than $1600/ha. The major implication of the economic analysis is that long-rotation hardwood plantation forestry is socio-economically justified in most Hardwood Regions, even though financial returns from timber production may be negative.
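The contrast between the financial model (timber returns only) and the economic model (timber plus ecosystem service values) can be pictured with a simple discounted cash flow comparison; the sketch below uses entirely hypothetical per-hectare figures, not the paper's estimates.

```python
# Minimal discounted cash flow sketch for a long-rotation plantation
# (all figures are hypothetical placeholders, not the paper's data).
def npv(cash_flows, rate):
    """Net present value of (year, amount) cash flows at a given discount rate."""
    return sum(amount / (1.0 + rate) ** year for year, amount in cash_flows)

rotation = 30              # years to clearfell
land_cost = -2300.0        # $/ha land purchase at year 0
establishment = -1500.0    # $/ha establishment cost at year 0
sawlog_revenue = 25000.0   # $/ha at harvest
annual_services = 80.0     # $/ha/yr ecosystem service value (carbon, salinity, etc.)

timber_only = [(0, land_cost + establishment), (rotation, sawlog_revenue)]
with_services = timber_only + [(t, annual_services) for t in range(1, rotation + 1)]

print(f"Financial NPV (timber only):  {npv(timber_only, 0.07):8.0f} $/ha")
print(f"Economic NPV (with services): {npv(with_services, 0.07):8.0f} $/ha")
```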
Abstract:
We develop foreign bank technical, cost and profit efficiency models for particular application with data envelopment analysis (DEA). Key motivations for the paper are (a) the often-observed practice of choosing inputs and outputs where the selection process is poorly explained and linkages to theory are unclear, and (b) foreign bank productivity analysis, which has been neglected in DEA banking literature. The main aim is to demonstrate a process grounded in finance and banking theories for developing bank efficiency models, which can bring comparability and direction to empirical productivity studies. We expect this paper to foster empirical bank productivity studies.
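As background to the modelling approach, the workhorse of DEA efficiency measurement is the input-oriented CCR envelopment program, min theta subject to X lambda <= theta x_o and Y lambda >= y_o. The sketch below solves it for a tiny, made-up sample of banks; the inputs and outputs are illustrative placeholders, not the variable set developed in the paper.

```python
import numpy as np
from scipy.optimize import linprog

# Input-oriented CCR efficiency for each bank (DMU) against a small sample.
# Inputs and outputs are arbitrary illustrative numbers, not a real bank data set.
X = np.array([[20., 30., 40., 25.],    # input 1 (e.g. deposits) for 4 banks
              [ 5.,  8.,  6.,  4.]])   # input 2 (e.g. staff)
Y = np.array([[15., 20., 26., 14.]])   # output 1 (e.g. loans)

def ccr_efficiency(o, X, Y):
    """Solve min theta s.t. X @ lam <= theta * X[:, o], Y @ lam >= Y[:, o], lam >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                  # minimise theta
    A_in = np.c_[-X[:, o], X]                    # X @ lam - theta * x_o <= 0
    A_out = np.c_[np.zeros(s), -Y]               # -Y @ lam <= -y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.fun

for o in range(X.shape[1]):
    print(f"bank {o}: efficiency = {ccr_efficiency(o, X, Y):.3f}")
```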
Abstract:
We develop a model for exponential decay of broadband pulses, and examine its implications for experiments on optical precursors. One of the signature features of Brillouin precursors is attenuation with a less rapid decay than that predicted by Beer's Law. Depending on the pulse parameters and the model that is adopted for the dielectric properties of the medium, the limiting z-dependence of the loss has been described as z^(-1/2), z^(-1/3), exponential, or, in more detailed descriptions, some combination of the above. Experimental results in the search for precursors are examined in light of the different models, and a stringent test for sub-exponential decay is applied to data on propagation of 500 femtosecond pulses through 1-5 meters of water.
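The competing decay laws can be compared directly: Beer's Law gives exponential attenuation exp(-alpha z), while precursor-dominated propagation is predicted to fall off algebraically. The sketch below tabulates the three forms over 1-5 m with an assumed absorption coefficient, purely for illustration.

```python
import numpy as np

# Compare Beer's-law exponential attenuation with the algebraic decays discussed
# for Brillouin precursors. alpha and the z grid are illustrative only.
alpha = 0.5                      # 1/m, assumed absorption coefficient
z = np.linspace(1.0, 5.0, 5)     # propagation distance in metres (cf. 1-5 m of water)

beer      = np.exp(-alpha * z)             # exponential (Beer's Law) energy decay
alg_half  = (z / z[0]) ** (-0.5)           # z^(-1/2) algebraic decay
alg_third = (z / z[0]) ** (-1.0 / 3.0)     # z^(-1/3) algebraic decay

# On a log-energy vs z plot, Beer's Law is a straight line while the algebraic
# decays curve upward (sub-exponential), which is the signature tested against data.
for zi, b, a2, a3 in zip(z, beer, alg_half, alg_third):
    print(f"z = {zi:.1f} m   exp: {b:.3f}   z^-1/2: {a2:.3f}   z^-1/3: {a3:.3f}")
```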
Abstract:
This paper elaborates the notion of "balanced" financial development that is contingent on a country's general level of development. We develop an empirical framework to address this point, referring to threshold regressions and a bootstrap test for structural shift in a growth equation. We find that countries gain less from financial activity if the latter fails to keep up with, or exceeds, what would follow from a balanced expansion path. These analyses contribute to the finance and growth literature by providing empirical support for the "balanced" financial development hypothesis.
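A threshold regression of the kind referred to above splits the growth equation at a candidate value of the threshold variable and keeps the split that minimises the residual sum of squares. A minimal sketch on synthetic data, purely to illustrate the estimation idea:

```python
import numpy as np

# Toy threshold-regression sketch: growth regressed on finance, with the slope
# allowed to shift once development (the threshold variable) exceeds tau.
# All data are synthetic; this only illustrates the estimation mechanics.
rng = np.random.default_rng(0)
n = 200
development = rng.uniform(0, 10, n)
finance = rng.uniform(0, 10, n)
growth = 0.5 * finance - 0.3 * finance * (development > 5) + rng.normal(0, 1, n)

def ssr_at(tau):
    """Sum of squared residuals when the finance slope is split at threshold tau."""
    X = np.column_stack([np.ones(n), finance, finance * (development > tau)])
    _, res, *_ = np.linalg.lstsq(X, growth, rcond=None)
    return res[0]

taus = np.linspace(2, 8, 61)
best = taus[np.argmin([ssr_at(t) for t in taus])]
print(f"estimated threshold: {best:.2f}")
```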
Abstract:
The recurrence interval statistics for regional seismicity follow a universal distribution function, independent of the tectonic setting or average rate of activity (Corral, 2004). The universal function is a modified gamma distribution, with power-law scaling for recurrence intervals shorter than the mean interval and exponential decay for longer intervals. We employ the method of Corral (2004) to examine the recurrence statistics of a range of cellular automaton earthquake models. The majority of models have an exponential distribution of recurrence intervals, the same as that of a Poisson process. One model, the Olami-Feder-Christensen automaton, has recurrence statistics consistent with regional seismicity for a certain range of the conservation parameter of that model. For conservation parameters in this range, the event size statistics are also consistent with regional seismicity. Models whose dynamics are dominated by characteristic earthquakes do not appear to display universality of recurrence statistics.
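The universal form is usually written for the rescaled interval theta = tau/<tau> as f(theta) = C theta^(gamma-1) exp(-theta/B): a power law at short intervals with an exponential tail. The sketch below evaluates this modified gamma form against the pure Poisson case; the parameter values are rough illustrative choices, not Corral's fitted ones.

```python
import numpy as np
from scipy.special import gamma as gamma_fn

# Modified gamma form for rescaled recurrence intervals theta = tau / <tau>:
#   f(theta) = C * theta**(g - 1) * exp(-theta / B)
# The values of g and B below are rough illustrative choices.
g, B = 0.67, 1.6
C = 1.0 / (B ** g * gamma_fn(g))   # normalisation so that f integrates to 1

def corral_pdf(theta):
    """Universal recurrence-interval density: power law at small theta, exponential tail."""
    return C * theta ** (g - 1.0) * np.exp(-theta / B)

# A pure Poisson process would instead give f(theta) = exp(-theta).
for theta in (0.1, 0.5, 1.0, 2.0, 5.0):
    print(f"theta = {theta:4.1f}   gamma model: {corral_pdf(theta):.3f}   Poisson: {np.exp(-theta):.3f}")
```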
Abstract:
The particle size of the bed sediments in or on many natural streams, alluvial fans, laboratory flumes, irrigation canals and mine waste deltas varies exponentially with distance along the stream. A plot of the available worldwide exponential bed particle size diminution coefficient data against stream length is presented, which shows that all the data lie within a single narrow band extending over virtually the whole range of stream lengths and bed sediment particle sizes found on Earth. This correlation applies to both natural and artificial flows with both sand and gravel beds, irrespective of either the solids concentration or whether normal or reverse sorting occurs. This strongly suggests that there are common mechanisms underlying the exponential diminution of bed particles in subaerial aqueous flows of all kinds. Thus existing models of sorting and abrasion applicable to some such flows may be applicable to others. A comparison of exponential laboratory abrasion and field diminution coefficients suggests that abrasion is unlikely to be significant in gravel and sand bed streams shorter than about 10 km to 100 km, and about 500 km, respectively.
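Exponential downstream fining of this kind is usually written in Sternberg's form D(x) = D0 exp(-a x), where a is the diminution coefficient. The sketch below recovers a from synthetic distance-size observations by a log-linear fit; the numbers are illustrative, not data from the compilation.

```python
import numpy as np

# Sternberg-type exponential downstream fining:  D(x) = D0 * exp(-a * x),
# where a is the diminution coefficient (1/km). Values below are illustrative only.
def particle_size(x_km, d0_mm, a_per_km):
    """Median bed particle size (mm) at distance x_km downstream."""
    return d0_mm * np.exp(-a_per_km * x_km)

# Fit the diminution coefficient from (distance, size) observations by a log-linear fit.
x = np.array([0.0, 20.0, 50.0, 100.0, 200.0])    # km downstream
d = np.array([64.0, 45.0, 28.0, 12.0, 2.5])      # mm (synthetic gravel-bed data)
slope, intercept = np.polyfit(x, np.log(d), 1)
a_fit, d0_fit = -slope, np.exp(intercept)
print(f"diminution coefficient a = {a_fit:.4f} /km, D0 = {d0_fit:.1f} mm")
```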
Abstract:
This report outlines the derivation and application of a Gaussian process with a non-zero mean and a polynomial-exponential covariance function, which forms the prior wind field model used in 'autonomous' disambiguation. It is principally used because the non-zero mean permits the computation of realistic local wind vector prior probabilities, which are required, as the marginals of the full wind field prior, when applying the scaled-likelihood trick. As the full prior is multivariate normal, these marginals are very simple to compute.
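A minimal sketch of such a prior is given below: a GP with a constant non-zero mean and a covariance built from a polynomial factor times an exponential. The particular kernel form, mean value and hyperparameters are assumptions for illustration; the report's exact covariance function is not reproduced here.

```python
import numpy as np

# Sketch of a non-zero-mean GP prior with a polynomial-exponential covariance.
# Kernel form and parameters are assumptions, not the report's actual model.
def poly_exp_kernel(x1, x2, sigma2=1.0, length=1.0):
    """k(r) = sigma2 * (1 + r/l) * exp(-r/l): a polynomial factor times an exponential."""
    r = np.abs(x1[:, None] - x2[None, :])
    return sigma2 * (1.0 + r / length) * np.exp(-r / length)

x = np.linspace(0.0, 5.0, 50)
mean = np.full_like(x, 3.0)                       # non-zero prior mean (e.g. background wind speed)
cov = poly_exp_kernel(x, x) + 1e-8 * np.eye(x.size)

# The marginal prior at each location is simply N(mean[i], cov[i, i]),
# which is what the scaled-likelihood trick needs.
sample = np.random.default_rng(1).multivariate_normal(mean, cov)
print(f"prior marginal at x[0]: mean {mean[0]:.1f}, std {np.sqrt(cov[0, 0]):.2f}")
print("one prior draw, first three values:", np.round(sample[:3], 2))
```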
Abstract:
An interactive hierarchical Generative Topographic Mapping (HGTM) has been developed to visualise complex data sets. In this paper, we build a more general visualisation system by extending the HGTM visualisation system in three directions: (1) We generalize HGTM to noise models from the exponential family of distributions. The basic building block is the Latent Trait Model (LTM) developed in [Kabanpami]. (2) We give the user a choice of initializing the child plots of the current plot in either interactive or automatic mode. In the interactive mode the user interactively selects "regions of interest" as in HGTM, whereas in the automatic mode an unsupervised minimum message length (MML)-driven construction of a mixture of LTMs is employed. (3) We derive general formulas for magnification factors in latent trait models. Magnification factors are a useful tool to improve our understanding of the visualisation plots, since they can highlight the boundaries between data clusters. The unsupervised construction is particularly useful when high-level plots are covered with dense clusters of highly overlapping data projections, making it difficult to use the interactive mode. Such a situation often arises when visualizing large data sets. We illustrate our approach on a toy example and apply our system to three more complex real data sets.
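Magnification factors measure how the latent-to-data-space mapping locally stretches the latent plane; for a GTM-style mapping y(z) = W phi(z) they reduce to sqrt(det(J^T J)), with J the Jacobian of the mapping. The sketch below computes this for a Gaussian-noise GTM with random placeholder parameters; the paper's general exponential-family (LTM) formulas are not reproduced here.

```python
import numpy as np

# Magnification factor sketch for a GTM-style mapping y(z) = W @ phi(z):
#   MF(z) = sqrt(det(J(z)^T J(z))), J being the Jacobian of the latent-to-data mapping.
# Basis centres, widths and weights below are random placeholders for illustration.
rng = np.random.default_rng(0)
centres = rng.uniform(-1, 1, size=(9, 2))    # RBF centres in the 2-D latent space
width2 = 0.5 ** 2                            # common RBF width squared
W = rng.normal(size=(5, 9))                  # maps 9 basis activations to 5-D data space

def magnification(z):
    diff = z - centres                                        # (9, 2)
    phi = np.exp(-np.sum(diff ** 2, axis=1) / (2 * width2))   # RBF activations, (9,)
    dphi = -(diff / width2) * phi[:, None]                    # d phi_k / d z, shape (9, 2)
    J = W @ dphi                                              # (5, 2) Jacobian
    return np.sqrt(np.linalg.det(J.T @ J))

print(f"magnification factor at latent origin: {magnification(np.zeros(2)):.3f}")
```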
Abstract:
Numerous studies find that monetary models of exchange rates cannot beat a random walk model. Such a finding, however, is not surprising given that such models are built upon money demand functions and traditional money demand functions appear to have broken down in many developed countries. In this article, we investigate whether using a more stable underlying money demand function results in improvements in forecasts of monetary models of exchange rates. More specifically, we use a sweep-adjusted measure of US monetary aggregate M1 which has been shown to have a more stable money demand function than the official M1 measure. The results suggest that the monetary models of exchange rates contain information about future movements of exchange rates, but the success of such models depends on the stability of money demand functions and the specifications of the models.
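The out-of-sample comparison referred to above is typically run as a horse race between a fundamentals-based equation and a no-change random walk forecast. A minimal sketch on synthetic data, purely to illustrate the evaluation scheme (it does not use the sweep-adjusted M1 series):

```python
import numpy as np

# Out-of-sample comparison of a simple monetary exchange-rate equation against a
# random walk, in the spirit of Meese-Rogoff style evaluations. Data are synthetic.
rng = np.random.default_rng(2)
T = 120
money_diff = np.cumsum(rng.normal(0, 0.5, T))     # relative money supply
income_diff = np.cumsum(rng.normal(0, 0.3, T))    # relative income
s = 0.8 * money_diff - 0.5 * income_diff + rng.normal(0, 1.0, T)   # log exchange rate

split = 80
errs_model, errs_rw = [], []
for t in range(split, T - 1):
    X = np.column_stack([np.ones(t), money_diff[:t], income_diff[:t]])
    beta, *_ = np.linalg.lstsq(X, s[:t], rcond=None)
    forecast = beta @ np.array([1.0, money_diff[t], income_diff[t]])
    errs_model.append(s[t + 1] - forecast)
    errs_rw.append(s[t + 1] - s[t])               # random walk: no change predicted

rmse = lambda e: np.sqrt(np.mean(np.square(e)))
print(f"monetary model RMSE: {rmse(errs_model):.3f}   random walk RMSE: {rmse(errs_rw):.3f}")
```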
Abstract:
This study focuses on: (i) the responsiveness of the U.S. financial sector stock indices to foreign exchange (FX) and interest rate changes; and (ii) the extent to which good model specification can enhance the forecasts from the associated models. Three models are considered. Only the error-correction model (ECM) generated efficient and consistent coefficient estimates. Furthermore, a simple zero-lag model in differences, which is clearly mis-specified, generated forecasts that are better than those of the ECM, even though the ECM depicts relationships that are more consistent with economic theory. In brief, FX and interest rate changes do not impact on the return-generating process of the stock indices in any substantial way. Most of the variation in the sector stock indices is associated with past variation in the indices themselves and variation in the market-wide stock index. These results have important implications for financial and economic policies.
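An error-correction model of this kind is commonly estimated in two steps: a long-run cointegrating regression, followed by a short-run equation in differences that includes the lagged error-correction term. A minimal sketch on synthetic series (not the study's data), assuming a single FX regressor:

```python
import numpy as np

# Two-step error-correction model sketch (Engle-Granger style) for a stock index
# responding to an exchange-rate series. All series are synthetic illustrations.
rng = np.random.default_rng(3)
T = 300
fx = np.cumsum(rng.normal(0, 1, T))                 # I(1) FX series
index = 2.0 * fx + rng.normal(0, 1, T)              # cointegrated stock index

# Step 1: long-run (cointegrating) regression  index_t = a + b * fx_t.
A = np.column_stack([np.ones(T), fx])
(a, b), *_ = np.linalg.lstsq(A, index, rcond=None)
ect = index - (a + b * fx)                          # error-correction term

# Step 2: short-run dynamics  d(index)_t = alpha * ect_{t-1} + gamma * d(fx)_t + e_t.
dy, dx = np.diff(index), np.diff(fx)
Z = np.column_stack([ect[:-1], dx])
(alpha, gamma_coef), *_ = np.linalg.lstsq(Z, dy, rcond=None)
print(f"speed of adjustment alpha = {alpha:.2f}, short-run FX effect gamma = {gamma_coef:.2f}")
```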