989 results for project portfolio view


Relevance:

30.00%

Publisher:

Abstract:

The success of any diversification strategy depends upon the quality of the estimated correlation between assets. It is well known, however, that the average correlation among assets tends to increase when the market falls, and vice versa. Assuming that the correlation between assets is constant over time therefore seems unrealistic. At the same time, the dependence of the correlation structure on the market's return suggests that correlation shifts can be modelled as a function of the market return. This is the idea behind the model of Spurgin et al. (2000), which models the beta, or systematic risk, of an asset as a function of the returns in the market. This approach offers particular attractions to fund managers, as it suggests ways in which they can adjust their portfolios to benefit from changes in overall market conditions. In this paper the Spurgin et al. (2000) model is applied to 31 real estate market segments in the UK, using monthly data over the period 1987:1 to 2000:12. The results show that a number of market segments display significant negative correlation shifts, while others show significant positive shifts. Using this information, fund managers can make strategic and tactical portfolio allocation decisions based on expectations of market volatility alone, helping them achieve greater portfolio performance overall, and especially during different phases of the real estate cycle.
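The beta-shift idea can be illustrated with a simple dual-beta regression in which an asset's beta is allowed to differ between rising and falling markets. This is a minimal sketch on synthetic data, not the authors' actual specification; the function name and all numbers are invented for illustration.

```python
import numpy as np

def dual_beta(asset, market):
    """Estimate separate betas for up and down markets via OLS.

    Regresses asset returns on market returns interacted with an
    up/down-market indicator:
        r_a = alpha + beta_up * r_m * I(r_m >= 0) + beta_down * r_m * I(r_m < 0)
    """
    up = np.where(market >= 0, market, 0.0)
    down = np.where(market < 0, market, 0.0)
    X = np.column_stack([np.ones_like(market), up, down])
    coef, *_ = np.linalg.lstsq(X, asset, rcond=None)
    alpha, beta_up, beta_down = coef
    return alpha, beta_up, beta_down

# Synthetic example: an asset whose systematic risk rises when the market falls.
rng = np.random.default_rng(0)
m = rng.normal(0.005, 0.04, 168)              # 14 years of monthly market returns
a = 0.001 + np.where(m < 0, 1.4, 0.6) * m + rng.normal(0, 0.005, 168)
alpha, b_up, b_down = dual_beta(a, m)
# b_down > b_up indicates a negative correlation shift: the segment
# moves more with the market when the market falls.
```

A fund manager expecting a volatile down-market phase would, on this logic, tilt away from segments with a large `b_down`.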

Relevance:

30.00%

Publisher:

Abstract:

In estimating the inputs to the Modern Portfolio Theory (MPT) portfolio optimisation problem, it is usual to use equally weighted historic data. Equal weighting, however, takes no account of the current state of the market, so this approach is unlikely to perform well in any subsequent period, as the data still reflect market conditions that are no longer valid. A return-weighting scheme that gives greater weight to the most recent data would therefore seem desirable. This study accordingly uses returns data weighted towards the most recent observations, to see whether such a weighting scheme can offer improved ex-ante performance over that based on unweighted data.
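One common scheme of this kind is exponential decay, where observation weights fall off geometrically with age. The sketch below is illustrative and not necessarily the weighting used in the study; the decay value is an assumption.

```python
import numpy as np

def ew_moments(returns, decay=0.94):
    """Exponentially weighted mean vector and covariance matrix.

    `returns` is a (T, N) array of historic returns, oldest first.
    Observation t gets weight decay**(T-1-t), so the most recent row
    carries the largest weight; decay=1.0 recovers the usual
    equal-weighted estimates.
    """
    T = returns.shape[0]
    w = decay ** np.arange(T - 1, -1, -1)
    w = w / w.sum()
    mean = w @ returns
    centred = returns - mean
    cov = (w[:, None] * centred).T @ centred
    return mean, cov

R = np.array([[0.01, 0.02], [0.03, -0.01], [0.02, 0.00]])
mean_eq, cov_eq = ew_moments(R, decay=1.0)   # plain sample moments
mean_ew, cov_ew = ew_moments(R, decay=0.5)   # tilted towards the last row
```

The weighted moments then feed the optimiser in place of the equal-weighted inputs.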

Relevance:

30.00%

Publisher:

Abstract:

An EPSRC 'Partnerships for Public Engagement' scheme 2010. FEC 122,545.56/UoR 10K. everything and nothing is a performance and workshop which engages the public creatively with mathematical concepts: the Poincaré conjecture, the shape of the universe, topology, and the nature of infinity are explored through an original, thought-provoking piece of music theatre. Jorge Luis Borges' short story 'The Library of Babel' and the aviator Amelia Earhart's attempt to circumnavigate the globe combine to communicate to audiences the key mathematical concepts of the Poincaré conjecture. The project builds on a 2008 EPSRC early development project (EP/G001650/1) and is led by the19thstep, an interdisciplinary team consisting of composer Dorothy Ker, sculptor Kate Allen and mathematician Marcus du Sautoy. everything and nothing has been devised by Dorothy Ker and Kate Allen, and is performed by percussionist Chris Brannick, mezzo-soprano Lucy Stevens and sound designer Kelcey Swain. The UK tour targets arts-going audiences, from the Green Man Festival to the British Science Festival. Each performance is accompanied by a workshop led by topologist Katie Steckles. Alongside the performances and workshops is a website, http://www.everythingandnothingproject.com/. Public engagement evaluation and monitoring for the project are carried out by evaluator Bea Jefferson. The project is significant in its timely relation to contemporary mathematics and arts-science themes, delivering an extensive programme of public engagement.

Relevance:

30.00%

Publisher:

Abstract:

The use of MPT in the construction of real estate portfolios has two serious limitations when used in an ex-ante framework: (1) the intertemporal instability of the portfolio weights and (2) the sharp deterioration in performance of the optimal portfolios outside the sample period used to estimate asset mean returns. Both problems can be traced to wide fluctuations in sample means (Jorion, 1985). Thus a procedure that ignores the estimation risk due to uncertainty in mean returns is likely to produce sub-optimal results in subsequent periods. This suggests that consideration of estimation risk is crucial if MPT is to be used to develop a successful real estate portfolio strategy. Therefore, following Eun and Resnick (1988), this study extends previous ex-ante based studies by evaluating optimal portfolio allocations in subsequent test periods, using methods that have been proposed to reduce the effect of measurement error on optimal portfolio allocations.
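A standard way to mitigate estimation risk in the mean vector is to shrink the sample means toward a common target, in the spirit of the Bayes-Stein approach associated with Jorion (1985). The sketch below fixes the shrinkage intensity by hand rather than estimating it from the data, so it illustrates the idea, not the full estimator.

```python
import numpy as np

def shrink_means(sample_means, intensity=0.5):
    """Shrink each asset's sample mean toward the grand mean.

    intensity=0 returns the raw sample means; intensity=1 gives every
    asset the same (grand-mean) expected return. The Bayes-Stein
    estimator chooses the intensity from the data; here it is fixed
    for illustration.
    """
    grand = sample_means.mean()
    return (1 - intensity) * sample_means + intensity * grand

mu = np.array([0.12, 0.02, 0.07])
mu_shrunk = shrink_means(mu, intensity=0.5)
# Extreme estimates are pulled toward the cross-sectional average (0.07),
# damping the wide fluctuations in sample means noted above.
```

Feeding the shrunk means into the optimiser typically stabilises the weights across re-estimation periods.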

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates the potential benefits and limitations of equal- and value-weighted diversification, using the UK institutional property market as an example. To do so it uses the largest sample (392) of actual property returns currently available, over the period 1981 to 1996. Two approaches are adopted: first, an analysis of the correlations within the sectors and regions; and second, simulations of property portfolios of increasing size, constructed both naively and with value-weighting. Using these methods it is shown that the extent of possible risk reduction is limited, because of the high positive correlations between assets in any portfolio, even when naively diversified. It is also shown that portfolios exhibit high levels of variability around the average risk, suggesting that previous work seriously understates the number of properties needed to achieve a satisfactory level of diversification. The results have implications for the development and maintenance of a property portfolio, because they indicate that the achievable level of risk reduction depends upon the availability of assets, the weighting system used and the investor's risk tolerance.
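The simulation approach described above can be sketched as follows: draw equally weighted (naive) portfolios of increasing size from a universe of correlated assets and track portfolio risk. The universe here is equicorrelated and entirely synthetic; the correlation and volatility values are assumptions chosen only to show the risk floor that high positive correlations impose.

```python
import numpy as np

rng = np.random.default_rng(42)
n_assets, rho, sigma = 392, 0.4, 0.15        # illustrative universe
# Equicorrelated covariance matrix: common correlation rho, volatility sigma.
cov = sigma**2 * ((1 - rho) * np.eye(n_assets) + rho * np.ones((n_assets, n_assets)))

def portfolio_risk(idx):
    """Risk of an equally weighted (naive) portfolio of the chosen assets."""
    w = np.full(len(idx), 1.0 / len(idx))
    sub = cov[np.ix_(idx, idx)]
    return float(np.sqrt(w @ sub @ w))

sizes = [1, 5, 20, 100]
mean_risks = []
for n in sizes:
    draws = [portfolio_risk(rng.choice(n_assets, size=n, replace=False))
             for _ in range(200)]
    mean_risks.append(np.mean(draws))
# Risk falls with size but is floored by the common correlation:
# no naive portfolio can get below sqrt(rho) * sigma, here about 0.095.
```

Replacing the equal weights with value weights, and giving assets heterogeneous volatilities, reproduces the variability around average risk discussed in the paper.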

Relevance:

30.00%

Publisher:

Abstract:

Despite a number of papers that discuss the advantages of increased size for risk levels in real estate portfolios, there is remarkably little empirical evidence based on actual portfolios. The objective of this paper is to remedy this deficiency by examining the portfolio risk of a large sample of actual property data over the period 1981 to 1996. The results show only that large portfolios of properties tend, on average, to have lower risks than small portfolios. More importantly, portfolios of a few properties can have very high or very low risk.

Relevance:

30.00%

Publisher:

Abstract:

The case for property has typically rested on the application of modern portfolio theory (MPT), in that property has been shown to offer increased diversification benefits within a multi-asset portfolio without hurting portfolio returns, especially for lower-risk portfolios. This view, however, is based upon the use of historic, usually appraisal-based, data for property. Recent research suggests strongly that such data significantly underestimate the risk characteristics of property, because appraisals explicitly or implicitly smooth out much of the real volatility in property returns. This paper examines the portfolio diversification effects of including property in a multi-asset portfolio, using UK appraisal-based (smoothed) data and several derived de-smoothed series. Having considered the effects of de-smoothing, we then consider the inclusion of a further low-risk asset (cash), in order to investigate whether property's place in a low-risk portfolio is maintained. The conclusion of this study is that the supposed benefits of including property have been overstated. Although property may still have a place in a 'balanced' institutional portfolio, the case for property needs to be reassessed, and not be based simplistically on the application of MPT.
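A widely used first-order de-smoothing transform treats the reported appraisal return as an exponentially smoothed version of the true return and inverts that filter. This is a generic sketch; the smoothing parameter and the sample series are invented, and the paper's own de-smoothed series may be derived differently.

```python
import numpy as np

def desmooth(smoothed, alpha=0.6):
    """First-order de-smoothing of an appraisal-based return series.

    Assumes the reported return is an exponentially smoothed version of
    the true return:  r_rep[t] = alpha * r_rep[t-1] + (1 - alpha) * r_true[t]
    and inverts it:   r_true[t] = (r_rep[t] - alpha * r_rep[t-1]) / (1 - alpha)
    """
    s = np.asarray(smoothed, dtype=float)
    return (s[1:] - alpha * s[:-1]) / (1 - alpha)

reported = np.array([0.010, 0.012, 0.011, 0.015, 0.009])
unsmoothed = desmooth(reported)
# The de-smoothed series has a higher standard deviation than the
# appraisal series -- the volatility that smoothing hides.
```

Running the asset-allocation exercise on the de-smoothed series, rather than the reported one, is what weakens property's apparent low-risk advantage.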

Relevance:

30.00%

Publisher:

Abstract:

We describe a model-data fusion (MDF) inter-comparison project (REFLEX), which compared various algorithms for estimating carbon (C) model parameters consistent with both measured carbon fluxes and states and a simple C model. Participants were provided with the model and with both synthetic net ecosystem exchange (NEE) of CO2 and leaf area index (LAI) data, generated from the model with added noise, and observed NEE and LAI data from two eddy covariance sites. Participants endeavoured to estimate model parameters and states consistent with the model for all cases over the two years for which data were provided, and to generate predictions for one additional year without observations. Nine participants contributed results using Metropolis algorithms, Kalman filters and a genetic algorithm. For the synthetic data case, parameter estimates compared well with the true values. The results of the analyses indicated that parameters linked directly to gross primary production (GPP) and ecosystem respiration, such as those related to foliage allocation and turnover or to the temperature sensitivity of heterotrophic respiration, were best constrained and characterised. Poorly estimated parameters were those related to the allocation to, and turnover of, the fine root and wood pools. Estimates of confidence intervals varied among algorithms, but several algorithms successfully located the true values of annual fluxes from the synthetic experiments within relatively narrow 90% confidence intervals, achieving a >80% success rate and mean NEE confidence intervals <110 g C m−2 year−1 for the synthetic case. Annual C flux estimates generated by participants generally agreed with gap-filling approaches using half-hourly data, and the estimation of ecosystem respiration and GPP through MDF agreed well with outputs from partitioning studies using half-hourly data. Confidence limits on annual NEE increased by an average of 88% in the prediction year compared to the previous year, when data were available. Confidence intervals on annual NEE increased by 30% when observed data were used instead of synthetic data, reflecting and quantifying the addition of model error. Finally, our analyses indicated that incorporating additional constraints, using data on C pools (wood, soil and fine roots), would help to reduce uncertainties for model parameters poorly served by eddy covariance data.
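Of the algorithm families mentioned, the Metropolis sampler is the simplest to sketch. The toy below fits a single temperature-sensitivity parameter of a made-up respiration model to noisy synthetic observations, echoing the synthetic-data case; it stands in for the far richer carbon models and parameter sets used in REFLEX.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "model": ecosystem respiration with temperature sensitivity q10.
def respiration(q10, temp, base=2.0):
    return base * q10 ** ((temp - 10.0) / 10.0)

# Synthetic observations from a known true parameter plus noise,
# mirroring the synthetic-data case in the text.
temp = rng.uniform(0, 30, 100)
true_q10 = 2.0
obs = respiration(true_q10, temp) + rng.normal(0, 0.1, 100)

def log_like(q10):
    resid = obs - respiration(q10, temp)
    return -0.5 * np.sum(resid**2) / 0.1**2

# Metropolis random walk over q10.
chain, q = [], 1.2
ll = log_like(q)
for _ in range(5000):
    prop = q + rng.normal(0, 0.05)
    if prop > 0:
        ll_prop = log_like(prop)
        if np.log(rng.uniform()) < ll_prop - ll:
            q, ll = prop, ll_prop
    chain.append(q)

posterior = np.array(chain[1000:])   # drop burn-in
# The posterior mean sits near true_q10; the spread of the chain gives
# a confidence interval analogous to those reported by participants.
```

Running the same sampler on observed rather than synthetic data would, as the text notes, widen the intervals by the (now unquantified) model error.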

Relevance:

30.00%

Publisher:

Abstract:

Gaining public acceptance is one of the main issues with large-scale low-carbon projects such as hydropower development. The World Commission on Dams has recommended that, to gain public acceptance, public involvement is necessary in the decision-making process (WCD, 2000). This paper examines the ways in which public involvement may be influenced by international financial institutions, which are financially significant actors in the planning and implementation of large-scale hydropower projects in developing-country contexts. Using the case study of the Nam Theun 2 Hydropower Project in Laos, the paper analyses how public involvement facilitated by the Asian Development Bank had a bearing on procedural and distributional justice. The paper analyses the extent of public participation and the assessment of the full social and environmental costs of the project in the cost-benefit analysis conducted during the project appraisal stage. It is argued that, while efforts were made to involve the public, several factors influenced procedural and distributional justice: the late contribution of the Asian Development Bank in the project appraisal stage, and the treatment of non-market values and the discount rate used to calculate the full social and environmental costs.

Relevance:

30.00%

Publisher:

Abstract:

Recent UK changes in the number of students entering higher education, and in the nature of financial support, highlight the complexity of students’ choices about human capital investments. Today’s students have to focus not on the relatively narrow issue of how much academic effort to invest, but instead on the more complicated issue of how to invest effort in pursuit of ‘employability skills’, and how to signal such acquisitions in the context of a highly competitive graduate jobs market. We propose a framework aimed specifically at students’ investment decisions, which encompasses corner solutions for both borrowing and employment while studying.

Relevance:

30.00%

Publisher:

Abstract:

The sensitivity to horizontal resolution of the climate, anthropogenic climate change, and seasonal predictive skill of the ECMWF model has been studied as part of Project Athena, an international collaboration formed to test the hypothesis that substantial progress in simulating and predicting climate can be achieved if mesoscale and subsynoptic atmospheric phenomena are more realistically represented in climate models. In this study the experiments carried out with the ECMWF model (atmosphere only) are described in detail. The focus here is on the tropics and the Northern Hemisphere extratropics during boreal winter. The resolutions considered in Project Athena for the ECMWF model are T159 (126 km), T511 (39 km), T1279 (16 km), and T2047 (10 km). It was found that increasing horizontal resolution improves the tropical precipitation, the tropical atmospheric circulation, the frequency of occurrence of Euro-Atlantic blocking, and the representation of extratropical cyclones in large parts of the Northern Hemisphere extratropics. All of these improvements come from the increase in resolution from T159 to T511, with relatively small changes for further resolution increases to T1279 and T2047, although it should be noted that results from the very highest resolution are from a previously untested model version. Problems in simulating the Madden–Julian oscillation remain unchanged for all resolutions tested. There is some evidence that increasing horizontal resolution to T1279 leads to moderate increases in seasonal forecast skill during boreal winter in the tropics and Northern Hemisphere extratropics. Sensitivity experiments are discussed, which help to foster a better understanding of some of the resolution dependence found for the ECMWF model in Project Athena.
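The grid spacings quoted above follow from the spectral truncation number: with N waves resolved around the globe, the effective grid spacing is roughly the equatorial circumference divided by 2N. A quick check against the listed resolutions (the circumference is the only input; the rule of thumb itself is an approximation):

```python
EARTH_CIRCUMFERENCE_KM = 40075.0

def grid_spacing_km(truncation):
    """Approximate grid spacing for spectral truncation T<n>."""
    return EARTH_CIRCUMFERENCE_KM / (2 * truncation)

for t in (159, 511, 1279, 2047):
    print(f"T{t}: ~{grid_spacing_km(t):.0f} km")
# T159: ~126 km, T511: ~39 km, T1279: ~16 km, T2047: ~10 km,
# matching the resolutions quoted for Project Athena.
```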

Relevance:

30.00%

Publisher:

Abstract:

A significant development in the Washington DC Arts and Humanities Commission programme, the 5x5 project represented the first publicly funded arts project of this type in the US capital. Following an international call, a panel selected 20 curators, who in turn each selected 5 artists. All curators' programmes and research were presented, and 5 curators' projects were selected. Research into the control issues surrounding the import and export of water from Japan was used to set up a project in which the public were invited to place one of one thousand small droplets of this imported water onto cherry blossom trees. Many of the interactions were recorded in a database that also included documentation of sites in Washington DC itself which have vested political or national interests in the earthquake and Fukushima Daiichi disaster. Hundreds of participants took part in the project over one week.

Relevance:

30.00%

Publisher:

Abstract:

Climate modeling is a complex process, requiring accurate and complete metadata in order to identify, assess and use climate data stored in digital repositories. The preservation of such data is increasingly important given the development of ever more complex models to predict the effects of global climate change. The EU METAFOR project has developed a Common Information Model (CIM) to describe climate data and the models and modelling environments that produce them. There is a wide degree of variability between different climate models and modelling groups. To accommodate this, the CIM has been designed to be highly generic and flexible, with extensibility built in. METAFOR describes the climate modelling process simply as "an activity undertaken using software on computers to produce data." This process has been described as separate UML packages (and, ultimately, XML schemas). This fairly generic structure can be paired with more specific "controlled vocabularies" in order to restrict the range of valid CIM instances. The CIM will aid digital preservation of climate models by providing an accepted standard structure for the model metadata. Tools to write and manage CIM instances, and to allow convenient and powerful searches of CIM databases, are also under development. Community buy-in of the CIM has been achieved through a continual process of consultation with the climate modelling community, and through the METAFOR team's development of a questionnaire that will be used to collect the metadata for the Intergovernmental Panel on Climate Change's (IPCC) Coupled Model Intercomparison Project Phase 5 (CMIP5) model runs.
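The pairing of a generic structure with controlled vocabularies can be illustrated with a minimal sketch: free-form metadata fields validated against a fixed vocabulary for specific fields. The field names and vocabulary below are invented for illustration and are not taken from the actual CIM schemas.

```python
# Hypothetical controlled vocabulary restricting one CIM-like field.
MODEL_REALMS = {"atmosphere", "ocean", "land_surface", "sea_ice"}

def validate(record):
    """Return a list of problems with a metadata record (empty = valid).

    Mirrors the two-layer design described in the text: a generic
    structure (any string fields) constrained by a controlled
    vocabulary for specific fields.
    """
    problems = []
    for field in ("model_name", "realm", "description"):
        if field not in record:
            problems.append(f"missing field: {field}")
    realm = record.get("realm")
    if realm is not None and realm not in MODEL_REALMS:
        problems.append(f"realm '{realm}' not in controlled vocabulary")
    return problems

good = {"model_name": "ExampleGCM", "realm": "atmosphere",
        "description": "toy record"}
bad = {"model_name": "ExampleGCM", "realm": "biosphere",
       "description": "toy record"}
# validate(good) returns no problems; validate(bad) flags the
# out-of-vocabulary realm.
```

Swapping in a different vocabulary, without touching the generic validation logic, is the kind of extensibility the CIM design aims for.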