Abstract:
Multi-factor approaches to the analysis of real estate returns have, since the pioneering work of Chan, Hendershott and Sanders (1990), emphasised a macro-variables approach in preference to the latent factor approach that formed the original basis of the arbitrage pricing theory. With the increasing use of high frequency data and trading strategies, and with a growing emphasis on the risks of extreme events, the macro-variable procedure has some deficiencies. This paper explores a third way, using an alternative to the standard principal components approach: independent components analysis (ICA). ICA seeks higher moment independence and maximises in relation to a chosen risk parameter. We apply an ICA based on kurtosis maximisation to weekly US REIT data. The results show that ICA is successful in capturing the kurtosis characteristics of REIT returns, offering possibilities for the development of risk management strategies that are sensitive to extreme events and tail distributions.
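The kurtosis-maximising extraction step can be sketched in a few lines; the following uses the cubic-nonlinearity FastICA fixed-point update on whitened data. The two synthetic sources, the mixing matrix and all parameters are illustrative stand-ins, not the paper's weekly REIT dataset or its specific algorithm.

```python
import numpy as np

def whiten(X):
    """Centre the data and transform it so its covariance is the identity."""
    X = X - X.mean(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(X, rowvar=False))
    return X @ vecs @ np.diag(vals ** -0.5) @ vecs.T

def extract_component(Z, n_iter=200, seed=0):
    """One-unit fixed-point iteration maximising |kurtosis| of w.z on
    whitened data Z (the cubic-nonlinearity FastICA update)."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=Z.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        y = Z @ w
        w_new = (Z * (y ** 3)[:, None]).mean(axis=0) - 3.0 * w
        w_new /= np.linalg.norm(w_new)
        if abs(w_new @ w) > 1 - 1e-10:  # converged (up to sign)
            return w_new
        w = w_new
    return w

# Illustrative data: a heavy-tailed and a light-tailed independent source,
# linearly mixed, standing in for return series.
rng = np.random.default_rng(42)
S = np.column_stack([rng.laplace(size=5000), rng.uniform(-1, 1, size=5000)])
A = np.array([[1.0, 0.6], [0.4, 1.0]])  # arbitrary mixing matrix
Z = whiten(S @ A.T)
w = extract_component(Z)
y = Z @ w  # recovered independent component
```

On whitened data the update `E[z (w.z)^3] - 3w` has the original source directions as fixed points, whether their kurtosis is positive (the Laplace source) or negative (the uniform source), so the recovered component aligns with one of the true sources up to sign and scale.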
Abstract:
Real estate development appraisal is a quantification of future expectations. The appraisal model relies upon the valuer/developer having an understanding of the future in terms of the future marketability of the completed development and the future cost of development. In some cases the developer has some degree of control over the possible variation in the variables, as with the cost of construction through the choice of specification. However, other variables, such as the sale price of the final product, are totally dependent upon the vagaries of the market at the completion date. To try to address the risk of an outcome different from the one expected (modelled), the developer will often carry out a sensitivity analysis on the development. However, traditional sensitivity analysis has generally looked only at the best and worst scenarios and has focused on the anticipated or expected outcomes. This does not take into account uncertainty and the range of outcomes that can happen. A fuller analysis should include examination of the uncertainties in each of the components of the appraisal and account for the appropriate distributions of the variables. Similarly, as many of the variables in the model are not independent, the variables need to be correlated. This requires a standardised approach, and we suggest that the use of a generic forecasting software package, in this case Crystal Ball, allows the analyst to work with an existing development appraisal model set up in Excel (or another spreadsheet) and with a predetermined set of probability distributions. Without a full knowledge of risk, developers are unable to determine the anticipated level of return that should be sought to compensate for that risk. This model allows the user a better understanding of the possible outcomes for the development.
Ultimately the final decision will be made relative to current expectations and current business constraints, but by assessing the upside and downside risks more appropriately, the decision maker should be better placed to make a more informed and “better” decision.
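The workflow described (correlated probability distributions feeding an appraisal model, yielding a distribution of outcomes rather than a point estimate) can be sketched outside a spreadsheet as well. The figures, volatilities and single correlation below are hypothetical, not the paper's; the Cholesky step is the correlation mechanism a package like Crystal Ball automates.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Illustrative appraisal inputs; all figures hypothetical.
mean_gdv = 12.0e6         # gross development value (sale price of scheme)
mean_build_cost = 6.0e6   # construction cost
fixed_costs = 1.5e6       # fees and finance, held fixed for simplicity

# Correlated uncertainty in value and cost via a Cholesky factor of the
# assumed correlation matrix.
corr = np.array([[1.0, 0.5],
                 [0.5, 1.0]])
z = rng.standard_normal((n, 2)) @ np.linalg.cholesky(corr).T
gdv = mean_gdv * np.exp(0.10 * z[:, 0] - 0.5 * 0.10**2)                # ~10% vol
build_cost = mean_build_cost * np.exp(0.08 * z[:, 1] - 0.5 * 0.08**2)  # ~8% vol

profit = gdv - build_cost - fixed_costs

# A distribution of outcomes rather than a single point estimate.
p5, p50, p95 = np.percentile(profit, [5, 50, 95])
prob_loss = (profit < 0).mean()
```

The downside tail (`p5`, `prob_loss`) is exactly the information a best/worst-case sensitivity analysis cannot provide, since it carries probabilities as well as magnitudes.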
Abstract:
Decision theory is the study of models of judgement involved in, and leading to, deliberate and (usually) rational choice. In real estate investment there are normative models for the allocation of assets. These asset allocation models suggest an optimum allocation between the respective asset classes based on the investors' judgements of performance and risk. Real estate, like other assets, is selected on the basis of some criteria, commonly its marginal contribution to the production of a mean-variance efficient multi-asset portfolio, subject to the investor's objectives and capital rationing constraints. However, decisions are made relative to current expectations and current business constraints. Whilst a decision maker may believe in the required optimum exposure levels as dictated by an asset allocation model, the final decision may be influenced by factors outside the parameters of the mathematical model. This paper discusses investors' perceptions and attitudes towards real estate and highlights the important difference between theoretical exposure levels and pragmatic business considerations. It develops a model to identify “soft” parameters in decision making which will influence the optimal allocation for that asset class. This “soft” information may relate to behavioural issues such as the tendency to mirror competitors, a desire to meet weight-of-money objectives, a desire to retain the status quo, and many other non-financial considerations. The paper aims to establish the place of property in multi-asset portfolios in the UK and to examine the asset allocation process in practice, with a view to understanding the decision making process and looking at investors' perceptions based on an historic analysis of market expectations, a comparison with historic data, and an analysis of actual performance.
Abstract:
This paper reviews and critiques the current practice of classifying building clients according to their 'type'. An alternative approach to understanding organisations is developed in accordance with the principles of naturalistic inquiry. It is contended that the complex pluralistic clients of the 1990s can only really be understood 'from the inside'. The concept of organisational metaphors is introduced as the basis for a more sophisticated way of thinking about organisations. The various strands of organisational theory are also analysed in terms of their underlying metaphors. Different theories are seen to bring different insights. The implicit metaphors adopted by practitioners are held to be important in that they tend to dictate the adopted approach to client briefing. This contention is illustrated by analysing three different characterisations of the briefing process in terms of their underlying metaphors. Finally, the discussion is placed in a contemporary UK context by comparing the dominant paradigm of practice during the 1980s to that of the 1990s.
Abstract:
As the field of international business has matured, there have been shifts in the core unit of analysis. First, there was analysis at country level, using national statistics on trade and foreign direct investment (FDI). Next, the focus shifted to the multinational enterprise (MNE) and the parent’s firm specific advantages (FSAs). Eventually the MNE was analysed as a network and the subsidiary became a unit of analysis. We untangle the last fifty years of international business theory using a classification by these three units of analysis. This is the country-specific advantage (CSA) and firm-specific advantage (FSA) matrix. Will this integrative framework continue to be useful in the future? We demonstrate that this is likely as the CSA/FSA matrix permits integration of potentially useful alternative units of analysis, including the broad region of the triad. Looking forward, we develop a new framework, visualized in two matrices, to show how distance really matters and how FSAs function in international business. Key to this are the concepts of compounded distance and resource recombination barriers facing MNEs when operating across national borders.
Abstract:
The adsorption of carbon monoxide on the Pt{110} surface at coverages of 0.5 ML and 1.0 ML was investigated using quantitative low-energy electron diffraction (LEED IV) and density-functional theory (DFT). At 0.5 ML CO lifts the reconstruction of the clean surface but does not form an ordered overlayer. At the saturation coverage, 1.0 ML, a well-ordered p(2×1) superstructure with glide line symmetry is formed. It was confirmed that the CO molecules adsorb on top of the Pt atoms in the top-most substrate layer with the molecular axes tilted by ±22° with respect to the surface normal in alternating directions away from the close packed rows of Pt atoms. This is accompanied by significant lateral shifts of 0.55 Å away from the atop sites in the same direction as the tilt. The top-most substrate layer relaxes inwards by −4% with respect to the bulk-terminated atom positions, while the consecutive layers only show minor relaxations. Despite the lack of long-range order in the 0.5 ML CO layer it was possible to determine key structural parameters by LEED IV using only the intensities of the integer-order spots. At this coverage CO also adsorbs on atop sites with the molecular axis closer to the surface normal (<10°). The average substrate relaxations in each layer are similar for both coverages and consistent with DFT calculations performed for a variety of ordered structures with coverages of 1.0 ML and 0.5 ML.
Abstract:
We consider the general response theory recently proposed by Ruelle for describing the impact of small perturbations to the non-equilibrium steady states resulting from Axiom A dynamical systems. We show that the causality of the response functions entails the possibility of writing a set of Kramers-Kronig (K-K) relations for the corresponding susceptibilities at all orders of nonlinearity. Nonetheless, only a special class of directly observable susceptibilities obey K-K relations. Specific results are provided for the case of arbitrary order harmonic response, which allows for a very comprehensive K-K analysis and the establishment of sum rules connecting the asymptotic behavior of the harmonic generation susceptibility to the short-time response of the perturbed system. These results place previous findings obtained for optical systems and simple mechanical models in a more general theoretical framework, and shed light on the very general impact of considering the principle of causality for testing self-consistency: the described dispersion relations constitute unavoidable benchmarks that any experimental or model-generated dataset must obey. The theory exposed in the present paper is dual to the time-dependent theory of perturbations to equilibrium states and to non-equilibrium steady states, and has in principle a similar range of applicability and limitations. In order to connect the equilibrium and the non-equilibrium steady state cases, we show how to rewrite the classical response theory by Kubo so that response functions formally identical to those proposed by Ruelle, apart from the measure involved in the phase space integration, are obtained. These results, taking into account the chaotic hypothesis by Gallavotti and Cohen, might be relevant in several fields, including climate research.
In particular, whereas the fluctuation-dissipation theorem does not work for non-equilibrium systems, because of the non-equivalence between internal and external fluctuations, K-K relations might be robust tools for the definition of a self-consistent theory of climate change.
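The self-consistency role of a K-K relation can be illustrated numerically on the textbook damped-oscillator susceptibility (a stand-in, not Ruelle's response functions): the real part is reconstructed from the imaginary part alone, and any dataset violating this check could not be causal. Grid sizes and oscillator parameters below are arbitrary.

```python
import numpy as np

omega0, gamma = 5.0, 0.5

def chi(w):
    """Damped-oscillator susceptibility, analytic in the upper half plane."""
    return 1.0 / (omega0**2 - w**2 - 1j * gamma * w)

# Frequency grid for the dispersion integral.
wp = np.linspace(1e-4, 200.0, 400_000)
im = chi(wp).imag

def kk_real(w):
    """Reconstruct Re chi(w) from Im chi via the K-K relation
    Re chi(w) = (2/pi) PV int_0^inf w' Im chi(w') / (w'^2 - w^2) dw'.
    Subtracting w*Im chi(w) removes the pole, since the PV integral of
    1/(w'^2 - w^2) over (0, inf) vanishes, leaving a regular integrand."""
    num = wp * im - w * chi(w).imag
    integrand = num / (wp**2 - w**2)
    integrand[np.abs(wp - w) < 1e-12] = 0.0  # guard the removable point
    return (2.0 / np.pi) * np.sum(integrand) * (wp[1] - wp[0])

w_test = 3.0
re_exact = chi(w_test).real  # 16/258.25, about 0.0620
re_kk = kk_real(w_test)      # should agree to well under 1%
```

The residual discrepancy comes only from truncating the integral at a finite frequency; extending the grid shrinks it, which is the practical form of the "unavoidable benchmark" statement.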
Abstract:
The technique of relaxation of the tropical atmosphere towards an analysis in a month-season forecast model has previously been successfully exploited in a number of contexts. Here it is shown that when tropical relaxation is used to investigate the possible origin of the observed anomalies in June–July 2007, a simple dynamical model is able to reproduce the observed component of the pattern of anomalies given by an ensemble of ECMWF forecast runs. Following this result, the simple model is used for a range of experiments on time-scales of relaxation, variables and regions relaxed based on a control model run with equatorial heating in a zonal flow. A theory based on scale analysis for the large-scale tropics is used to interpret the results. Typical relationships between scales are determined from the basic equations, and for a specified diabatic heating a chain of deductions for determining the dependent variables is derived. Different critical time-scales are found for tropical relaxation of different dependent variables to be effective. Vorticity has the longest critical time-scale, typically 1.2 days. For temperature and divergence, the time-scales are 10 hours and 3 hours, respectively. However not all the tropical fields, in particular the vertical motion, are reproduced correctly by the model unless divergence is heavily damped. To obtain the correct extra-tropical fields, it is crucial to have the correct rotational flow in the subtropics to initiate the Rossby wave propagation from there. It is sufficient to relax vorticity or temperature on a time-scale comparable or less than their critical time-scales to obtain this. However if the divergent advection of vorticity is important in the Rossby Wave Source then strong relaxation of divergence is required to accurately represent the tropical forcing of Rossby waves.
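Relaxation (nudging) amounts to adding a term -(x - x_ana)/tau to the model tendencies, so that the state is pulled towards the analysis on time-scale tau. A minimal sketch, with the Lorenz-63 system standing in for the dynamical model and an arbitrary "analysis" state (all values illustrative):

```python
import numpy as np

def lorenz(x, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Lorenz-63 tendencies, standing in for the simple dynamical model."""
    return np.array([sigma * (x[1] - x[0]),
                     x[0] * (rho - x[2]) - x[1],
                     x[0] * x[1] - x[2]])

def integrate(x0, x_ana, tau, dt=0.001, nsteps=20_000):
    """Forward-Euler integration with a relaxation (nudging) term
    -(x - x_ana)/tau pulling the state towards the 'analysis' x_ana."""
    x = x0.copy()
    for _ in range(nsteps):
        x = x + dt * (lorenz(x) - (x - x_ana) / tau)
    return x

x_ana = np.array([1.0, 2.0, 25.0])       # hypothetical analysis state
x0 = np.array([-5.0, -5.0, 20.0])
x_fast = integrate(x0, x_ana, tau=0.01)  # relaxation faster than the dynamics
x_slow = integrate(x0, x_ana, tau=50.0)  # relaxation slower than the dynamics
```

When tau is short compared to the dynamical time-scales the state is pinned near the analysis (the residual offset is of order tau times the model tendency there); when tau is long the nudging is too weak to constrain the chaotic evolution, which is the qualitative content of the critical time-scale result.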
Abstract:
The persistence of investment performance is a topic of perennial interest to investors. Efficient markets theory tells us that past performance cannot be used to predict future performance, yet investors appear to be influenced by historical performance in making their investment allocation decisions. The problem has been of particular interest to investors in real estate, not least because reported returns from investment in real estate are serially correlated, thus implying some persistence in investment performance. This paper applies the established approach of Markov chain analysis to investigate the relationship between past and present performance of UK real estate over the period 1981 to 1996. The data are analysed by sector, region and size. Furthermore, some variations in investment performance classification are reported, and the results are shown to be robust.
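The Markov chain approach can be sketched directly: classify each period's return (here simply above or below the median), estimate the transition matrix by counting, and compare its diagonal with the 0.5 expected under no persistence. The AR(1) series and its parameters below are synthetic stand-ins for the UK data, with positive autocorrelation mimicking the serial correlation in reported returns.

```python
import numpy as np

def transition_matrix(states, k):
    """Maximum-likelihood transition matrix from a sequence of labels 0..k-1."""
    counts = np.zeros((k, k))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

# Synthetic serially correlated return series (AR(1)).
rng = np.random.default_rng(0)
n, phi = 20_000, 0.6
r = np.zeros(n)
for t in range(1, n):
    r[t] = phi * r[t - 1] + rng.standard_normal()

# Classify each period as below (0) or above (1) the median return.
states = (r > np.median(r)).astype(int)
P = transition_matrix(states, 2)
# Under persistence, the diagonal entries of P exceed 0.5.
```

For a Gaussian AR(1) with lag-one correlation 0.6, the probability of staying above the median is 1/2 + arcsin(0.6)/pi, roughly 0.70, so the estimated diagonal should sit well above the independence benchmark.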
Abstract:
Modern Portfolio Theory (MPT) has been advocated as a more rational approach to the construction of real estate portfolios. The application of MPT can now be achieved with relative ease using the powerful facilities of modern spreadsheets, and does not necessarily need specialist software. This capability is found in an add-in tool, now included in several spreadsheets, called an Optimiser or Solver. The value of using this kind of more sophisticated analytical feature of spreadsheets is increasingly difficult to ignore. This paper examines the use of the spreadsheet Optimiser in handling asset allocation problems. Using the Markowitz mean-variance approach, the paper introduces the necessary calculations and shows, by means of an elementary example implemented in Microsoft Excel, how the Optimiser may be used. Emphasis is placed on understanding the inputs and outputs of the portfolio optimisation process, and the danger of treating the Optimiser as a black box is discussed.
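The same optimisation one would hand to a spreadsheet Solver can be reproduced with a numerical solver, which helps de-mystify the black box: minimise portfolio variance subject to full investment, no short sales and a target return. The three-asset statistics below are illustrative, not from the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative annual statistics for three asset classes.
mu = np.array([0.08, 0.10, 0.06])           # expected returns
cov = np.array([[0.0400, 0.0060, 0.0020],
                [0.0060, 0.0625, 0.0030],
                [0.0020, 0.0030, 0.0100]])  # covariance matrix
target = 0.08                               # required portfolio return

# Minimise w' cov w subject to sum(w) = 1, w'mu = target, 0 <= w <= 1,
# the same objective and constraint set one would enter in Solver.
res = minimize(
    lambda w: w @ cov @ w,
    x0=np.full(3, 1 / 3),
    method="SLSQP",
    bounds=[(0.0, 1.0)] * 3,
    constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0},
                 {"type": "eq", "fun": lambda w: w @ mu - target}],
)
w_opt = res.x
```

Inspecting `w_opt` against the inputs (which assets earn the target return cheaply in variance terms, and which constraints bind) is exactly the understanding of inputs and outputs the paper argues for.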
Plane wave discontinuous Galerkin methods for the 2D Helmholtz equation: analysis of the $p$-version
Abstract:
Plane wave discontinuous Galerkin (PWDG) methods are a class of Trefftz-type methods for the spatial discretization of boundary value problems for the Helmholtz operator $-\Delta-\omega^2$, $\omega>0$. They include the so-called ultra weak variational formulation from [O. Cessenat and B. Després, SIAM J. Numer. Anal., 35 (1998), pp. 255–299]. This paper is concerned with the a priori convergence analysis of PWDG in the case of $p$-refinement, that is, the study of the asymptotic behavior of relevant error norms as the number of plane wave directions in the local trial spaces is increased. For convex domains in two space dimensions, we derive convergence rates, employing mesh skeleton-based norms, duality techniques from [P. Monk and D. Wang, Comput. Methods Appl. Mech. Engrg., 175 (1999), pp. 121–136], and plane wave approximation theory.
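For concreteness, the local trial spaces whose dimension grows under $p$-refinement consist of plane waves with distinct propagation directions; with the common choice of $p$ equispaced directions on an element $K$ (an illustrative convention, since more general direction sets are possible), $PW_p(K)=\operatorname{span}\{\,x\mapsto e^{i\omega d_j\cdot x}:\ d_j=(\cos\theta_j,\sin\theta_j),\ \theta_j=2\pi j/p,\ j=0,\dots,p-1\,\}$. Each such basis function satisfies $-\Delta u-\omega^2 u=0$ exactly, which is the Trefftz property: the method approximates only within the solution space of the homogeneous Helmholtz equation, so $p$-convergence is governed by how well sums of plane waves approximate Helmholtz solutions.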