783 results for Allocation approaches


Relevance: 20.00%

Abstract:

The position of Real Estate within a multi-asset portfolio has received considerable attention recently. Previous research has concentrated on the percentage holding property would achieve given its risk/return characteristics. Such studies have invariably used Modern Portfolio Theory, and these approaches have been criticised both for the quality of the real estate data and for problems with the methodology itself. The first problem is now well understood, and the second can be addressed by the use of realistic constraints on asset holdings. This paper takes a different approach. We determine the level of return that Real Estate needs to achieve to justify an allocation within the multi-asset portfolio. In order to test the importance of the quality of the data, we use historic appraisal-based and desmoothed returns to examine the sensitivity of the results. Consideration is also given to the holding period and the imposition of realistic constraints on the asset holdings in order to model portfolios held by pension fund investors. We conclude, using several benchmark levels of portfolio risk and return, that using appraisal-based data the required level of return for Real Estate was less than that achieved over the period 1972-1993. The use of desmoothed series can reverse this result at the highest levels of desmoothing, although within a restricted holding period Real Estate offered returns in excess of those required to enter the portfolio, and so might have a role to play in the multi-asset portfolio.
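The abstract does not state which desmoothing procedure the paper applies; as an illustration of the general idea, a first-order (Geltner-style) correction of an appraisal-based return series can be sketched as follows, with the smoothing parameter `a` treated as an assumed analyst input:

```python
# Illustrative first-order desmoothing of appraisal-based returns (a sketch,
# not the paper's method). It assumes the appraisal return is a smoothed
# series r*_t = a*r*_{t-1} + (1-a)*r_t, so the underlying return is
# recovered as r_t = (r*_t - a*r*_{t-1}) / (1 - a).

def desmooth(returns, a):
    """Recover underlying returns from a smoothed (appraisal-based) series.

    returns : list of period returns (e.g. quarterly)
    a       : assumed smoothing parameter in [0, 1); higher = heavier smoothing
    """
    out = [returns[0]]  # no prior observation, so keep the first value as-is
    for prev, curr in zip(returns, returns[1:]):
        out.append((curr - a * prev) / (1.0 - a))
    return out

smoothed = [0.02, 0.021, 0.022, 0.020]
print(desmooth(smoothed, 0.5))
```

Desmoothing raises the volatility of the series, which is why, as the abstract notes, heavier desmoothing can raise the return Real Estate must deliver to enter the optimal portfolio.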

Relevance: 20.00%

Abstract:

Tepe Pardis, a significant Neolithic–Chalcolithic site on the Tehran Plain in Iran, is, like many sites in the area, under threat from development. The site contains detailed evidence of (1) the Neolithic–Chalcolithic transition, (2) an Iron Age cemetery and (3) how the inhabitants adapted to an unstable fan environment through resource exploitation (of clay deposits for relatively large-scale ceramic production by c. 5000 BC, and importantly, possible cutting of artificial water channels). Given this significance, models have been produced to better understand settlement distribution and change in the region. However, these models must be tied into a greater understanding of the impact of the geosphere on human development over this period. Forming part of a larger project focusing on the transformation of simple, egalitarian Neolithic communities into more hierarchical Chalcolithic ones, the site has become the focus of a multidisciplinary project to address this issue. Through the combined use of sedimentary and limited pollen analysis, radiocarbon and optically stimulated luminescence dating (the application of the last still rare in Iran), a greater understanding of the impact of alluvial fan development on human settlement through alluviation and the development of river channel sequences is possible. Notably, the findings presented here suggest that artificial irrigation was occurring at the site as early as 6.7±0.4 ka (4300–5100 BC).

Relevance: 20.00%

Abstract:

Environmental policy in the United Kingdom (UK) is witnessing a shift from command-and-control approaches towards more innovation-orientated environmental governance arrangements. These governance approaches create institutions that support actors within a domain in learning not only about policy options, but also about their own interests and preferences. The need for construction actors to understand, engage with and influence this process is critical to establishing policies which support innovation that satisfies each constituent’s needs. This capacity is particularly salient in an era where the expanding raft of environmental regulation is ushering in system-wide innovation in the construction sector. In this paper, the Code for Sustainable Homes (the Code) in the UK is used to demonstrate the emergence and operation of these new governance arrangements. The Code sets out a significant innovation challenge for the house-building sector with, for example, a requirement that all new houses must be zero-carbon by 2016. Drawing upon boundary organisation theory, the journey from the Code as a government aspiration, to the Code as a catalyst for the formation of the Zero Carbon Hub (ZCH), a new institution, is traced and discussed. The case study reveals that the ZCH has demonstrated boundary organisation properties in its ability to be flexible to the needs and constraints of its constituent actors, yet robust enough to maintain and promote a common identity across regulation and industry boundaries.

Relevance: 20.00%

Abstract:

Models of root system growth emerged in the early 1970s, and were based on mathematical representations of root length distribution in soil. The last decade has seen the development of more complex architectural models and the use of computer-intensive approaches to study developmental and environmental processes in greater detail. There is a pressing need for predictive technologies that can integrate root system knowledge, scaling from molecular to ensembles of plants. This paper makes the case for more widespread use of simpler models of root systems based on continuous descriptions of their structure. A new theoretical framework is presented that describes the dynamics of root density distributions as a function of individual root developmental parameters such as rates of lateral root initiation, elongation, mortality, and gravitropism. The simulations resulting from such equations can be performed most efficiently in discretized domains that deform as a result of growth, and that can be used to model the growth of many interacting root systems. The modelling principles described help to bridge the gap between continuum and architectural approaches, and enhance our understanding of the spatial development of root systems. Our simulations suggest that root systems develop in travelling wave patterns of meristems, revealing order in otherwise spatially complex and heterogeneous systems. Such knowledge should assist physiologists and geneticists to appreciate how meristem dynamics contribute to the pattern of growth and functioning of root systems in the field.
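As a loose illustration of the continuum idea described above (a sketch under assumed parameters, not the paper's actual framework), a one-dimensional model in which meristem density is advected downward at an elongation rate, while new lateral meristems are initiated at a branching rate, produces a growing front of the kind the abstract calls a travelling wave:

```python
# Minimal 1-D continuum sketch of root-tip (meristem) density. All parameter
# names and values are illustrative assumptions, not taken from the paper.

def simulate(steps=200, nz=100, dz=0.1, dt=0.01, elong=1.0, branch=0.5):
    """Advect meristem density downward at rate `elong` (upwind scheme),
    with new lateral meristems initiated at rate `branch` per meristem."""
    n = [0.0] * nz
    n[0] = 1.0                       # seed: all meristems start at the surface
    c = elong * dt / dz              # Courant number; must be <= 1 for stability
    for _ in range(steps):
        new = n[:]
        for i in range(1, nz):
            new[i] = n[i] - c * (n[i] - n[i - 1])    # upwind advection
        n = [v * (1.0 + branch * dt) for v in new]   # branching source term
    return n
```

Running `simulate()` yields a density profile whose front advances at the elongation rate while total meristem number grows exponentially, i.e. order emerging from a simple continuum description rather than explicit architecture.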

Relevance: 20.00%

Abstract:

We describe a model-data fusion (MDF) inter-comparison project (REFLEX), which compared various algorithms for estimating carbon (C) model parameters and states consistent with both measured carbon fluxes and a simple C model. Participants were provided with the model and with both synthetic net ecosystem exchange (NEE) of CO2 and leaf area index (LAI) data, generated from the model with added noise, and observed NEE and LAI data from two eddy covariance sites. Participants endeavoured to estimate model parameters and states consistent with the model for all cases over the two years for which data were provided, and to generate predictions for one additional year without observations. Nine participants contributed results using Metropolis algorithms, Kalman filters and a genetic algorithm. For the synthetic data case, parameter estimates compared well with the true values. The results of the analyses indicated that parameters linked directly to gross primary production (GPP) and ecosystem respiration, such as those related to foliage allocation and turnover, or temperature sensitivity of heterotrophic respiration, were best constrained and characterised. Poorly estimated parameters were those related to the allocation to and turnover of fine root/wood pools. Estimates of confidence intervals varied among algorithms, but several algorithms successfully located the true values of annual fluxes from synthetic experiments within relatively narrow 90% confidence intervals, achieving a >80% success rate and mean NEE confidence intervals <110 gC m−2 year−1 for the synthetic case. Annual C flux estimates generated by participants generally agreed with gap-filling approaches using half-hourly data. The estimation of ecosystem respiration and GPP through MDF agreed well with outputs from partitioning studies using half-hourly data. Confidence limits on annual NEE increased by an average of 88% in the prediction year compared to the previous year, when data were available. Confidence intervals on annual NEE increased by 30% when observed data were used instead of synthetic data, reflecting and quantifying the addition of model error. Finally, our analyses indicated that incorporating additional constraints, using data on C pools (wood, soil and fine roots), would help to reduce uncertainties for model parameters poorly served by eddy covariance data.
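The Metropolis approach used by several REFLEX participants can be sketched in a few lines: propose a random-walk step in parameter space and accept it with probability given by the likelihood ratio. The toy model below (a one-parameter linear model with Gaussian noise) is an illustrative stand-in, not the REFLEX carbon model:

```python
import math
import random

# Hedged sketch of random-walk Metropolis sampling for parameter estimation
# from noisy observations. The "model" y = k * x is a stand-in assumption.

def metropolis(xs, ys, sigma, steps=5000, seed=0):
    rng = random.Random(seed)

    def log_like(k):
        # Gaussian log-likelihood up to an additive constant
        return -sum((y - k * x) ** 2 for x, y in zip(xs, ys)) / (2 * sigma ** 2)

    k, ll = 0.0, None
    ll = log_like(k)
    samples = []
    for _ in range(steps):
        cand = k + rng.gauss(0, 0.1)               # random-walk proposal
        cand_ll = log_like(cand)
        if math.log(rng.random()) < cand_ll - ll:  # Metropolis accept/reject
            k, ll = cand, cand_ll
        samples.append(k)
    return samples

# Synthetic-data check, mirroring the REFLEX design: generate data from a
# known parameter (k = 2), add noise, and see whether the chain recovers it.
rng = random.Random(1)
xs = [i / 10 for i in range(1, 51)]
ys = [2.0 * x + rng.gauss(0, 0.1) for x in xs]
chain = metropolis(xs, ys, 0.1)
est = sum(chain[1000:]) / len(chain[1000:])        # posterior mean after burn-in
```

The spread of the post-burn-in chain also gives the confidence intervals the abstract discusses, which is how the participating algorithms reported uncertainty as well as point estimates.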

Relevance: 20.00%

Abstract:

Several methods are examined which allow forecasts for time series to be produced in the form of probability assignments. The necessary concepts are presented, addressing questions such as how to assess the performance of a probabilistic forecast. A particular class of models, cluster weighted models (CWMs), receives special attention. CWMs, originally proposed for deterministic forecasts, can be employed for probabilistic forecasting with little modification. Two examples are presented. The first involves estimating the state of (numerically simulated) dynamical systems from noise-corrupted measurements, a problem also known as filtering. There is an optimal solution to this problem, called the optimal filter, to which the considered time series models are compared. (The optimal filter requires the dynamical equations to be known.) In the second example, we aim at forecasting the chaotic oscillations of an experimental bronze spring system. Both examples demonstrate that the considered time series models, and especially the CWMs, provide useful probabilistic information about the underlying dynamical relations. In particular, they provide more than just an approximation to the conditional mean.
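One standard way to assess the performance of a probabilistic forecast, of the kind the abstract refers to, is a scoring rule such as the logarithmic (ignorance) score: the negative log probability the forecast assigned to the outcome that actually occurred. The forecasts below are illustrative, not taken from the paper:

```python
import math

# Mean ignorance (logarithmic) score for discrete probabilistic forecasts.
# Lower is better; a uniform two-outcome forecast scores exactly 1 bit.

def ignorance(forecasts, outcomes):
    """forecasts : list of dicts mapping outcome -> assigned probability
    outcomes  : list of observed outcomes, one per forecast"""
    return sum(-math.log2(f[o]) for f, o in zip(forecasts, outcomes)) / len(outcomes)

sharp = [{"rain": 0.9, "dry": 0.1}] * 4   # confident forecaster
vague = [{"rain": 0.5, "dry": 0.5}] * 4   # uninformative forecaster
obs = ["rain", "rain", "dry", "rain"]
print(ignorance(sharp, obs), ignorance(vague, obs))
```

A well-calibrated sharp forecast beats the uninformative one on average, which is precisely the sense in which a model can provide "more than just an approximation to the conditional mean".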

Relevance: 20.00%

Abstract:

Asset allocation is concerned with the development of multi-asset portfolio strategies that are likely to meet an investor’s objectives based on the interaction of expected returns, risk, correlation and implementation from a range of distinct asset classes or beta sources. Challenges associated with the discipline are often particularly significant in private markets. Specifically, composition differences between the ‘index’ or ‘benchmark’ universe and the investible universe mean that there can often be substantial and meaningful deviations between the investment characteristics implied in asset allocation decisions and those delivered by investment teams. For example, while allocation decisions are often based on relatively low-risk diversified real estate ‘equity’ exposure, implementation decisions frequently include exposure to higher-risk forms of the asset class as well as investments in debt-based instruments. These differences can have a meaningful impact on the contribution of the asset class to the overall portfolio and, therefore, lead to a potential misalignment between asset allocation decisions and implementation. Despite this, the key conclusion from this paper is not that real estate investors should become slaves to a narrowly defined mandate based on IPD/NCREIF or other forms of benchmark replication. The discussion suggests that such an approach would likely lead to the underutilization of real estate in multi-asset portfolio strategies. Instead, it is that to achieve asset allocation alignment, real estate exposure should be divided into multiple pools representing distinct forms of the asset class. In addition, the paper suggests that associated investment guidelines and processes should be collaborative and reflect the portfolio-wide asset allocation objectives of each pool. Further, where appropriate, they should specifically target potential for ‘additional’ beta or, more marginally, ‘alpha’.

Relevance: 20.00%

Abstract:

In recognition of their competitive vulnerability, a set of special rules has been devised for managing sectors such as steel and cement within the EU ETS. These rules basically seek to set sector-specific performance benchmarks and reward top performers. However, the steel sector as a whole will receive the vast majority of its allowances for free in Phase III. Perceptions of competitive vulnerability have been largely based on inherently hypothetical analyses which rely heavily on counterfactual scenarios and abatement cost estimates often provided by firms themselves. This paper diverges from these approaches by providing a qualitative assessment of the two key reasons underpinning the competitive vulnerability argument of the EU steel companies, based on interviews and case studies involving the three largest producers of steel within the EU: ArcelorMittal, Corus and ThyssenKrupp. We find that these arguments provide only partial and weak justifications for competitive loss and discriminatory treatment in the EU ETS. This strategy is difficult for governments to counter due to information asymmetry, and it appears to have proved very successful insofar as it has helped the industry to achieve free allocation in Phases I-III of the EU ETS by playing up the risk of carbon leakage.

Relevance: 20.00%

Abstract:

This paper discusses key contextual differences and similarities in a comparative study on brownfield regeneration in England and Japan. Over the last decade, the regeneration of large-scale ‘flagship’ projects has been a primary focus in England, and previous research has discussed policy issues and key barriers at these sites. However, further research is required to explore specific barriers associated with problematic ‘hardcore’ sites suffering from long-term dereliction due to site-specific obstacles such as contamination and fragmented ownership. In comparison with England, brownfield regeneration is a relatively new urban agenda in Japan. Japan has less experience in terms of promoting redevelopment of brownfield sites at national level and the specific issues of ‘hardcore’ sites have been under-researched. The paper reviews and highlights important issues in comparing the definitions, national policy frameworks and the current stock of brownfields.

Relevance: 20.00%

Abstract:

Platelets in the circulation are triggered by vascular damage to activate, aggregate and form a thrombus that prevents excessive blood loss. Platelet activation is stringently regulated by intracellular signalling cascades, which when activated inappropriately lead to myocardial infarction and stroke. Strategies to address platelet dysfunction have included proteomics approaches, which have led to the discovery of a number of novel regulatory proteins of potential therapeutic value. Global analysis of platelet proteomes may enhance the outcome of these studies by arranging this information in a contextual manner that recapitulates established signalling complexes and predicts novel regulatory processes. Platelet signalling networks have already begun to be exploited with interrogation of protein datasets using in silico methodologies that locate functionally feasible protein clusters for subsequent biochemical validation. Characterization of these biological systems through analysis of spatial and temporal organization of component proteins is developing alongside advances in the proteomics field. This focused review highlights advances in platelet proteomics data mining approaches that complement the emerging systems biology field. We have also highlighted nucleated cell types as key examples that can inform platelet research. Therapeutic translation of these modern approaches to understanding platelet regulatory mechanisms will enable the development of novel anti-thrombotic strategies.
