847 results for REAL INTEREST-RATE
Abstract:
A new Bayesian algorithm for retrieving surface rain rate from Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI) over the ocean is presented, along with validations against estimates from the TRMM Precipitation Radar (PR). The Bayesian approach offers a rigorous basis for optimally combining multichannel observations with prior knowledge. While other rain-rate algorithms have been published that are based at least partly on Bayesian reasoning, this is believed to be the first self-contained algorithm that fully exploits Bayes’s theorem to yield not just a single rain rate, but rather a continuous posterior probability distribution of rain rate. To advance the understanding of the theoretical benefits of the Bayesian approach, sensitivity analyses have been conducted based on two synthetic datasets for which the “true” conditional and prior distributions are known. Results demonstrate that even when the prior and conditional likelihoods are specified perfectly, biased retrievals may occur at high rain rates. This bias is not the result of a defect of the Bayesian formalism, but rather represents the expected outcome when the physical constraint imposed by the radiometric observations is weak owing to saturation effects. It is also suggested that both the choice of estimator and the prior information are crucial to the retrieval. In addition, the performance of the Bayesian algorithm herein is found to be comparable to that of other benchmark algorithms in real-world applications, while having the additional advantage of providing a complete continuous posterior probability distribution of surface rain rate.
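The posterior construction this abstract describes can be illustrated with a toy one-channel example. Everything below — the exponential prior, the saturating forward model, the noise level and the observed brightness temperature — is an invented assumption for illustration, not part of the actual TMI algorithm:

```python
import math

# Discrete posterior over rain rate R given one brightness-temperature
# observation T, via Bayes' theorem with a Gaussian likelihood.

rain_rates = [0.1 * i for i in range(201)]        # candidate rates, 0-20 mm/h

def forward_model(r):
    """Toy radiative model: brightness temperature saturates at high rain rate."""
    return 180.0 + 100.0 * (1.0 - math.exp(-r / 5.0))

def posterior(t_obs, sigma=3.0):
    """Normalised posterior over rain_rates for observation t_obs."""
    prior = [math.exp(-r / 4.0) for r in rain_rates]          # assumed prior
    like = [math.exp(-0.5 * ((t_obs - forward_model(r)) / sigma) ** 2)
            for r in rain_rates]
    unnorm = [p * l for p, l in zip(prior, like)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

post = posterior(t_obs=240.0)
post_mean = sum(r * p for r, p in zip(rain_rates, post))   # one possible estimator
```

Note how the forward model flattens at high rain rates: there the observation constrains the posterior only weakly, which is exactly the saturation regime the abstract identifies as the source of bias in high-rain-rate retrievals.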
Abstract:
The kinetics of the reactions of the atoms O(³P), S(³P), Se(³P), and Te(³P) with a series of alkenes are examined for correlations relating the logarithms of the rate coefficients to the energies of the highest occupied molecular orbitals (HOMOs) of the alkenes. These correlations may be employed to predict rate coefficients from the calculated HOMO energy of any other alkene of interest. The rate coefficients obtained from the correlations were used to formulate structure-activity relations (SARs) for reactions of O(³P), S(³P), Se(³P), and Te(³P) with alkenes. A comparison of the values predicted by both the correlations and the SARs with experimental data, where they exist, allowed us to assess the reliability of our method. We demonstrate the applicability of perturbation frontier molecular orbital theory to gas-phase reactions of these atoms with alkenes. The correlations are apparently not applicable to reactions of C(³P), Si(³P), N(⁴S), and Al(²P) atoms with alkenes, a conclusion that could be explained in terms of a different mechanism for reaction of these atoms.
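The correlation this abstract describes — logarithm of the rate coefficient against HOMO energy — amounts to a simple least-squares line that can then be used predictively. The (E_HOMO, k) pairs below are invented illustrative numbers, not measured rate coefficients:

```python
import math

# Fit log10(k) = a * E_HOMO + b and predict k for a new alkene.
# Illustrative data: (HOMO energy in eV, rate coefficient in cm^3 s^-1).
data = [(-10.5, 1.2e-13), (-10.1, 8.0e-13), (-9.7, 4.5e-12), (-9.3, 2.1e-11)]

def fit_log_linear(pairs):
    """Return slope a and intercept b of the least-squares line."""
    xs = [e for e, _ in pairs]
    ys = [math.log10(k) for _, k in pairs]
    n = len(pairs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

a, b = fit_log_linear(data)

def predict_k(e_homo):
    """Predicted rate coefficient for an alkene with the given HOMO energy."""
    return 10.0 ** (a * e_homo + b)
```

A positive slope reflects the frontier-orbital picture: alkenes with higher (less negative) HOMO energies react faster with these electrophilic atoms.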
Abstract:
This paper examines whether the asset holdings and weights of an international real estate portfolio using exchange rate adjusted returns are essentially the same as, or radically different from, those based on unadjusted returns. The results indicate that the portfolio compositions produced by exchange rate adjusted returns are markedly different from those based on unadjusted returns. However, following the introduction of the single currency, the differences in portfolio composition are much less pronounced. The findings have a practical consequence for the investor because they suggest that, following the introduction of the single currency, international investors can concentrate on the real estate fundamentals when making their portfolio choices, rather than worrying about the implications of exchange rate risk.
Abstract:
The reduction of portfolio risk is important to all investors but is particularly important to real estate investors as most property portfolios are generally small. As a consequence, portfolios are vulnerable to a significant risk of under-performing the market, or a target rate of return, and so investors may be exposing themselves to greater risk than necessary. Given the potentially higher risk of underperformance from owning only a few properties, we follow the approach of Vassal (2001) and examine the benefits of holding more properties in a real estate portfolio. Using Monte Carlo simulation and the returns from 1,728 properties in the IPD database, held over the 10-year period from 1995 to 2004, the results show that increases in portfolio size offer the possibility of a more stable and less volatile return pattern over time, i.e. down-side risk diminishes with increasing portfolio size. Nonetheless, increasing portfolio size has the disadvantage of reducing the probability of out-performing the benchmark index by a significant amount. In other words, although increasing portfolio size reduces the down-side risk in a portfolio, it also decreases its up-side potential. Be that as it may, the results provide further evidence that portfolios with large numbers of properties are always preferable to portfolios of a smaller size.
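The Monte Carlo experiment described above can be sketched in a few lines: draw equally weighted portfolios of n properties from a return pool and observe how the spread of portfolio returns narrows as n grows. The return pool here is simulated with made-up parameters, not the IPD data:

```python
import random
import statistics

random.seed(42)
# Synthetic pool of annual property returns (invented mean and volatility).
pool = [random.gauss(0.09, 0.12) for _ in range(1728)]

def portfolio_return_spread(n, trials=2000):
    """Std. dev. of equally weighted n-property portfolio returns across draws."""
    rets = [statistics.fmean(random.sample(pool, n)) for _ in range(trials)]
    return statistics.stdev(rets)

spread_small = portfolio_return_spread(5)
spread_large = portfolio_return_spread(50)
```

The larger portfolio tracks the pool mean more tightly, illustrating both sides of the abstract's finding: lower down-side risk, but also a lower chance of beating the benchmark by a wide margin.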
Abstract:
The “case for real estate” in the mixed-asset portfolio is a topic of continuing interest to practitioners and academics. The argument is typically made by comparing efficient frontiers of portfolios that include real estate with those that exclude it. However, most investors will have held inefficient portfolios. Thus, when analysing real estate’s place in the mixed-asset portfolio, it seems illogical to do so by comparing the difference in risk-adjusted performance between efficient portfolios, which few if any investors would have held. The approach adopted here, therefore, is to compare the risk-adjusted performance of a number of mixed-asset portfolios without real estate (which may or may not be efficient) with a very large number of mixed-asset portfolios that include real estate (which again may or may not be efficient), to see how often there is an increase in risk-adjusted performance, significant or otherwise, using appraisal-based and de-smoothed annual data from 1952-2003. So, to the question of how often the addition of private real estate increases risk-adjusted performance compared with mixed-asset portfolios without real estate, the answer is almost all the time. However, significant increases are harder to find. Additionally, a significant increase in risk-adjusted performance can come from either reductions in portfolio risk or increases in return, depending on the investor’s initial portfolio structure. In other words, simply adding real estate to a mixed-asset portfolio is not enough to ensure significant increases in performance, as the results depend on the percentage added and the proper reallocation of the initial portfolio mix in the expanded portfolio.
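The comparison described above can be sketched as a simple simulation: for many random mixed-asset weightings, check how often carving out a real-estate slice raises a Sharpe-style risk-adjusted return. The asset series below are simulated with invented parameters, not the 1952-2003 data:

```python
import random
import statistics

random.seed(3)
periods = 200
stocks = [random.gauss(0.10, 0.16) for _ in range(periods)]
bonds = [random.gauss(0.05, 0.07) for _ in range(periods)]
real_estate = [random.gauss(0.08, 0.09) for _ in range(periods)]

def risk_adjusted(returns):
    """Mean return per unit of volatility (a Sharpe-style ratio)."""
    return statistics.fmean(returns) / statistics.pstdev(returns)

trials = 500
improved = 0
for _ in range(trials):
    w = random.random()                          # random stock weight, rest bonds
    base = [w * s + (1 - w) * b for s, b in zip(stocks, bonds)]
    # Carve a 10% real-estate slice out of the base portfolio proportionally.
    mixed = [0.9 * p + 0.1 * r for p, r in zip(base, real_estate)]
    improved += risk_adjusted(mixed) > risk_adjusted(base)

share_improved = improved / trials
```

`share_improved` is the simulated analogue of the paper's "how often" question; whether any single improvement is statistically significant is a separate test.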
Abstract:
The performance of various statistical models and commonly used financial indicators in forecasting securitised real estate returns is examined for five European countries: the UK, Belgium, the Netherlands, France and Italy. Within a VAR framework, it is demonstrated that the gilt-equity yield ratio is in most cases a better predictor of securitised returns than the term structure or the dividend yield. In particular, investors should consider in their real estate return models the predictability of the gilt-equity yield ratio in Belgium, the Netherlands and France, and the term structure of interest rates in France. Predictions obtained from the VAR and univariate time-series models are compared with the predictions of an artificial neural network model. It is found that, whilst no single model is universally superior across all series, accuracy measures and horizons considered, the neural network model is generally able to offer the most accurate predictions for 1-month horizons. For quarterly and half-yearly forecasts, the random walk with a drift is the most successful for the UK, Belgian and Dutch returns and the neural network for French and Italian returns. Although this study underscores market context and forecast horizon as parameters relevant to the choice of the forecast model, it strongly indicates that analysts should exploit the potential of neural networks and assess more fully their forecast performance against more traditional models.
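The random-walk-with-drift benchmark mentioned above is simple enough to state in a few lines: the h-step forecast is the last observation plus h times the average historical change. The price series here is illustrative, not one of the five countries' return indices:

```python
import statistics

def rw_drift_forecast(series, horizon):
    """Random-walk-with-drift forecast: last value plus horizon * mean change."""
    drift = statistics.fmean(b - a for a, b in zip(series, series[1:]))
    return series[-1] + horizon * drift

prices = [100.0, 101.5, 102.1, 104.0, 103.6, 105.2]   # invented index levels
one_month = rw_drift_forecast(prices, 1)
one_quarter = rw_drift_forecast(prices, 3)
```

Its strength at quarterly and half-yearly horizons in the study is a reminder that richer models (VAR, neural networks) must beat this near-trivial baseline to justify their complexity.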
Abstract:
Persistence of property returns is a topic of perennial interest to fund managers as it suggests that choosing those properties that will perform well in the future is as simple as looking at those that performed well in the past. Consequently, much effort has been expended to determine if such a rule exists in the real estate market. This paper extends earlier studies in the US, Australian, and UK markets in two ways. First, this study applies the same methodology originally used in Young and Graff (1996), making the results directly comparable with those in the US and Australian property markets. Second, this study uses a much longer and larger database covering all commercial property data available from the Investment Property Databank (IPD) for the years 1981 to 2002, comprising as many as 216,758 individual property returns. While the performance results of this study mimic the US and Australian results of greater persistence in the extreme first and fourth quartiles, they also show persistence in the moderate second and third quartiles, a notable departure from previous studies. Likewise, patterns across property type, location, time, and holding period are remarkably similar, leading to the conjecture that behaviors in the practice of commercial real estate investment management are themselves deeply rooted and persistent, and perhaps influenced, for good or ill, by agency effects.
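The quartile-persistence test used in this line of work can be sketched directly: rank each property's return into quartiles in two consecutive periods and compare the observed repeat frequency with the 25% expected under no persistence. The returns below are illustrative random numbers with some persistence injected, not IPD data:

```python
import random

random.seed(1)

def quartile(value, values):
    """Quartile (1-4) of value within the cross-section of values."""
    rank = sorted(values).index(value)
    return 1 + (4 * rank) // len(values)

n = 400
year1 = [random.gauss(0.08, 0.10) for _ in range(n)]
# Year-2 returns partly echo year 1 to simulate persistence.
year2 = [0.5 * r + random.gauss(0.04, 0.08) for r in year1]

q1 = [quartile(r, year1) for r in year1]
q2 = [quartile(r, year2) for r in year2]
repeat_rate = sum(a == b for a, b in zip(q1, q2)) / n
```

A `repeat_rate` meaningfully above 0.25 is the signature of persistence; the studies cited above find it concentrated in (but, per this paper, not confined to) the extreme quartiles.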
Abstract:
A near real-time flood detection algorithm giving a synoptic overview of the extent of flooding in both urban and rural areas, and capable of working at night and during the day even if cloud is present, could be a useful tool for operational flood relief management. The paper describes an automatic algorithm using high resolution Synthetic Aperture Radar (SAR) satellite data that builds on existing approaches, including the use of image segmentation techniques prior to object classification to cope with the very large number of pixels in these scenes. Flood detection in urban areas is guided by the flood extent derived in adjacent rural areas. The algorithm assumes that high resolution topographic height data are available for at least the urban areas of the scene, in order that a SAR simulator may be used to estimate areas of radar shadow and layover. The algorithm proved capable of detecting flooding in rural areas using TerraSAR-X with good accuracy, classifying 89% of flooded pixels correctly, with an associated false positive rate of 6%. Of the urban water pixels visible to TerraSAR-X, 75% were correctly detected, with a false positive rate of 24%. If all urban water pixels were considered, including those in shadow and layover regions, these figures fell to 57% and 18% respectively.
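The accuracy figures quoted above (detection rate and false positive rate) can be illustrated with a toy pixel classifier: flooded pixels tend to have low backscatter (smooth open water), so a simple threshold separates them. The pixel values and threshold below are invented; the real algorithm involves segmentation plus shadow and layover masks:

```python
import random

random.seed(7)
truth = [random.random() < 0.3 for _ in range(10_000)]        # True = flooded
# Backscatter in dB: flooded pixels darker (lower) than dry ones.
backscatter = [random.gauss(-18 if wet else -8, 3) for wet in truth]

predicted = [db < -13.0 for db in backscatter]                # threshold rule

tp = sum(p and t for p, t in zip(predicted, truth))
fp = sum(p and not t for p, t in zip(predicted, truth))
detection_rate = tp / sum(truth)
false_positive_rate = fp / sum(not t for t in truth)
```

Moving the threshold trades the two rates against each other, which is why the paper reports both figures for each environment (rural vs urban, with and without shadow/layover pixels).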
Abstract:
This article presents findings and seeks to establish the theoretical markers that indicate the growing importance of fact-based drama in screen and theatre performance to the wider Anglophone culture. During the final decade of the twentieth century and the opening one of the twenty-first, television docudrama and documentary theatre have grown in visibility and importance in the UK, providing key responses to social, cultural and political change over the millennial period. Actors were the prime focus for the enquiry, principally because so little research has been done into the special demands that fact-based performance makes on them. The main emphasis in actor training (in the UK at any rate) is, as it always has been, on preparation for fictional drama. Preparation in acting schools is also heavily geared towards stage performance. Our thesis was that performers called upon to play the roles of real people, in whatever medium, have added responsibilities both towards history and towards real individuals and their families. Actors must engage with ethical questions whether they like it or not, and we found them keenly aware of this. In the course of the research, we conducted 30 interviews with a selection of actors ranging from the experienced to the recently trained. We also interviewed a few industry professionals and actor trainers. Once the interviews started, it was clear that actors themselves made little or no distinction between how they set about their work for television and film. The essential disciplines for work in front of the camera, they told us, are the same whether the camera is electronic or photographic. Some adjustments become necessary, of course, in the multi-camera TV studio. But much serious drama for the screen is made on film anyway. We found it was also the case that young actors now tend to get their first paid employment before a camera rather than on a stage.
The screen-before-stage tendency, along with the fundamental re-shaping that has gone on in the British theatre since at least the early 1980s, had implications for actor training. We have also found that theatre work still tends to be most valued by actors. For all the actors we interviewed, theatre was what they liked doing best because it was there they could practise and develop their skills, there they could work most collectively towards performance, and there they could more directly experience audience feedback in the real time of the stage play. The current world of television has been especially constrained in regard to rehearsal time in comparison to theatre (and, to a lesser extent, film). This has also affected actors’ valuation of their work. Theatre is, and is not, the most important medium in which they find work. Theatre is most important spiritually and intellectually, because theatre work is collaborative, intensive, and involving; theatre is not as important in financial and career terms, because it is not as lucrative and not as visible to a large public as acting for the screen. Many actors took the view that, for all the industrial differences that do affect them and inevitably interest the academic, acting for the visible media of theatre, film and television involved fundamentally the same process with slightly different emphases.
Abstract:
The rapid expansion of the TMT sector in the late 1990s and the more recent growing regulatory and corporate focus on business continuity and security have raised the profile of data centres. Data centres offer a unique blend of occupational, physical and technological characteristics compared to conventional real estate assets. Limited trading and the heterogeneity of data centres also cause higher levels of appraisal uncertainty. In practice, the application of conventional discounted cash flow approaches requires information about a wide range of inputs that is difficult to derive from limited market signals or estimate analytically. This paper outlines an approach that uses pricing signals from similar traded cash flows. Based upon ‘the law of one price’, the method draws upon the premise that two identical future cash flows must have the same value now. Given the difficulties of estimating exit values, an alternative is that the expected cash flows of a data centre are analysed over the life cycle of the building, with corporate bond yields used to provide a proxy for the appropriate discount rates for lease income. Since liabilities are quite diverse, a number of proxies are suggested as discount and capitalisation rates, including index-linked, fixed-interest and zero-coupon bonds. Although there are rarely assets that have identical cash flows and some approximation is necessary, the level of appraiser subjectivity is dramatically reduced.
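The pricing idea above reduces to discounting the lease cash flows at the yield of a traded instrument with a comparable cash-flow profile. The cash flows and the proxy bond yield below are illustrative assumptions, not market data:

```python
def present_value(cash_flows, discount_rate):
    """PV of end-of-year cash flows at a flat annual discount rate."""
    return sum(cf / (1.0 + discount_rate) ** t
               for t, cf in enumerate(cash_flows, start=1))

lease_income = [1.2e6] * 15     # 15 years of fixed lease income (invented)
bond_yield = 0.055              # proxy: yield on a comparable corporate bond
value = present_value(lease_income, bond_yield)
```

Under the law of one price, a fixed lease stream priced off a fixed-interest bond yield needs no appraiser-chosen discount rate; index-linked or zero-coupon proxies would substitute for income streams with other profiles.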
Abstract:
The persistence of investment performance is a topic of perennial interest to investors. Efficient markets theory tells us that past performance cannot be used to predict future performance, yet investors appear to be influenced by historical performance in making their investment allocation decisions. The problem has been of particular interest to investors in real estate, not least because reported returns from investment in real estate are serially correlated, thus implying some persistence in investment performance. This paper applies the established approach of Markov chain analysis to investigate the relationship between past and present performance of UK real estate over the period 1981 to 1996. The data are analysed by sector, region and size. Furthermore, some variations in investment performance classification are reported and the results are shown to be robust.
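The Markov chain analysis named above boils down to estimating a transition matrix between performance states from a sequence of period-by-period classifications. The two-state sequence below is invented for illustration, not the UK data:

```python
def transition_matrix(states, labels=("under", "over")):
    """Row-normalised transition probabilities estimated from a state sequence."""
    counts = {a: {b: 0 for b in labels} for a in labels}
    for prev, cur in zip(states, states[1:]):
        counts[prev][cur] += 1
    return {a: {b: counts[a][b] / max(1, sum(counts[a].values()))
                for b in labels} for a in labels}

# Illustrative history of a property's performance relative to the market.
history = ["over", "over", "under", "over", "over", "over", "under", "under"]
P = transition_matrix(history)
```

Diagonal entries well above the unconditional state frequencies (e.g. `P["over"]["over"]` much larger than the share of "over" periods) are the Markov-chain signature of persistence.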
Abstract:
The literature on investors’ holding periods for equities and bonds suggests that high transaction costs are associated with longer holding periods. Return volatility, by contrast, is associated with short-term trading and hence shorter holding periods. High transaction costs and the perceived illiquidity of the real estate market lead to an expectation of longer holding periods. Further, work on depreciation and obsolescence might suggest that there is an optimal holding period. However, there is little empirical work in the area. In this paper, data from the Investment Property Databank are used to investigate sales rates and holding periods for UK institutional real estate between 1981 and 1994. Sales rates are investigated using the Cox proportional hazards framework. The results show longer holding periods than those claimed by investors. There are marked differences by type of property, and sales rates vary over time. Contemporaneous returns are positively associated with an increase in the rate of sale. The results shed light on investor behaviour.
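The sales-rate idea above can be illustrated, more simply than the Cox model the paper actually fits, with an empirical hazard of sale per year of holding that allows for right-censored holdings (properties still held at the end of the sample). The holding periods below are invented:

```python
def annual_hazard(holdings, max_years=10):
    """hazard[t] = sales in year t+1 / properties still at risk entering it."""
    hazard = []
    for year in range(1, max_years + 1):
        at_risk = sum(1 for dur, _ in holdings if dur >= year)
        sold_now = sum(1 for dur, sold in holdings if sold and dur == year)
        hazard.append(sold_now / at_risk if at_risk else 0.0)
    return hazard

# (duration in years, sold?) - False means still held at sample end (censored).
holdings = [(3, True), (7, True), (10, False), (5, True), (10, False),
            (2, True), (8, True), (10, False), (6, True), (4, True)]
hazard = annual_hazard(holdings)
```

A Cox model generalises this by letting the hazard shift multiplicatively with covariates such as property type and contemporaneous returns, which is how the paper detects those effects.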
Abstract:
Linear models of market performance may be misspecified if the market is subdivided into distinct regimes exhibiting different behaviour. Price movements in the US Real Estate Investment Trusts and UK Property Companies Markets are explored using a Threshold Autoregressive (TAR) model with regimes defined by the real rate of interest. In both US and UK markets, distinctive behaviour emerges, with the TAR model offering better predictive power than a more conventional linear autoregressive model. The research points to the possibility of developing trading rules to exploit the systematically different behaviour across regimes.
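A two-regime threshold autoregression of the kind described above switches its AR coefficients according to whether the real interest rate is above or below a threshold. The coefficients, threshold and inputs below are illustrative assumptions, not the estimated US/UK models:

```python
def tar_forecast(prev_return, real_rate, threshold=2.0,
                 low=(0.002, 0.30), high=(-0.001, -0.10)):
    """One-step TAR(1) forecast: (intercept, slope) chosen by the rate regime."""
    intercept, slope = low if real_rate <= threshold else high
    return intercept + slope * prev_return

f_low = tar_forecast(prev_return=0.01, real_rate=1.5)    # low-real-rate regime
f_high = tar_forecast(prev_return=0.01, real_rate=3.5)   # high-real-rate regime
```

Because the two regimes can imply forecasts of opposite sign for the same lagged return, a TAR can out-predict a single linear AR whenever behaviour genuinely differs across regimes, which is the basis for the trading rules the paper points to.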
Abstract:
A parallel processor architecture based on a communicating sequential processor chip, the transputer, is described. The architecture is easily linearly extensible to enable separate functions to be included in the controller. To demonstrate the power of the resulting controller some experimental results are presented comparing PID and full inverse dynamics on the first three joints of a Puma 560 robot. Also examined are some of the sample rate issues raised by the asynchronous updating of inertial parameters, and the need for full inverse dynamics at every sample interval is questioned.
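The PID side of the comparison above is the textbook discrete controller sketched below; gains, the unit-inertia joint model and the 1 ms sample interval are illustrative, and the original controller of course ran across transputer processors rather than in Python:

```python
class PID:
    """Textbook discrete PID: u = Kp*e + Ki*integral(e) + Kd*de/dt."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Drive a unit-inertia joint towards 1.0 rad with a 1 ms sample interval.
pid = PID(kp=400.0, ki=0.0, kd=40.0, dt=0.001)
pos, vel = 0.0, 0.0
for _ in range(2000):
    torque = pid.update(1.0, pos)
    vel += torque * pid.dt        # unit inertia: acceleration = torque
    pos += vel * pid.dt
```

Full inverse dynamics instead computes the torque from the manipulator's equations of motion each sample, which is precisely the per-sample cost the paper questions.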
Abstract:
Simulating spiking neural networks is of great interest to scientists wanting to model the functioning of the brain. However, large-scale models are expensive to simulate due to the number and interconnectedness of neurons in the brain. Furthermore, where such simulations are used in an embodied setting, the simulation must be real-time in order to be useful. In this paper we present NeMo, a platform for such simulations which achieves high performance through the use of highly parallel commodity hardware in the form of graphics processing units (GPUs). NeMo makes use of the Izhikevich neuron model which provides a range of realistic spiking dynamics while being computationally efficient. Our GPU kernel can deliver up to 400 million spikes per second. This corresponds to a real-time simulation of around 40 000 neurons under biologically plausible conditions with 1000 synapses per neuron and a mean firing rate of 10 Hz.
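The per-neuron work that NeMo parallelises on the GPU is the Izhikevich update: two coupled state variables (v, u) advanced each millisecond, with a spike-and-reset rule at v ≥ 30 mV. The sketch below uses the standard regular-spiking parameters and an invented constant input current:

```python
def izhikevich_step(v, u, current, a=0.02, b=0.2, c=-65.0, d=8.0):
    """Advance one neuron by 1 ms; return (v, u, spiked)."""
    # Two half-steps for v improve numerical stability (Izhikevich, 2003).
    v += 0.5 * (0.04 * v * v + 5.0 * v + 140.0 - u + current)
    v += 0.5 * (0.04 * v * v + 5.0 * v + 140.0 - u + current)
    u += a * (b * v - u)
    if v >= 30.0:
        return c, u + d, True       # spike: reset v, bump recovery variable u
    return v, u, False

# Drive one neuron with a constant input for one simulated second.
v, u, spikes = -65.0, -13.0, 0
for _ in range(1000):
    v, u, fired = izhikevich_step(v, u, current=10.0)
    spikes += fired
```

Scaling this to the throughput quoted in the abstract means running tens of thousands of such updates, plus synaptic current accumulation, in parallel per simulated millisecond, which is what maps naturally onto GPU hardware.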