Abstract:
This paper seeks to increase the understanding of the performance implications for investors who choose to combine an unlisted real estate portfolio (in this case German Spezialfonds) with a (global) listed real estate element. We call this a “blended” approach to real estate allocations. For the avoidance of doubt, in this paper we are dealing purely with real estate equity (listed and unlisted) allocations, and do not incorporate real estate debt (listed or unlisted) or direct property into the process. A previous paper (Moss and Farrelly 2014) showed the benefits of the blended approach as applied to UK Defined Contribution pension schemes. The catalyst for this paper has been the recent attention focused on German pension fund allocations, which have a relatively low (real estate) equity content and a high bond content. We have used the MSCI Spezialfonds Index as a proxy for domestic German institutional real estate allocations, and the EPRA Global Developed Index as a proxy for a global listed real estate allocation. We also examine whether a rules-based trading strategy, in this case trend following, can improve risk-adjusted returns above those of a simple buy-and-hold strategy for our sample period 2004-2015. Our findings are that by blending a 30% global listed portfolio with a 70% allocation to Spezialfonds (as opposed to a typical 100% weighting), the real estate allocation returns increase from 2.88% p.a. to 5.42% p.a. Volatility increases, but only to 6.53%; there is, however, a noticeable impact on maximum drawdown, which increases to 19.4%. By using a trend-following strategy, raw returns are improved from 2.88% to 6.94% p.a., the Sharpe ratio increases from 1.05 to 1.49, and the maximum drawdown is only 1.83%, compared with 19.4% under a buy-and-hold strategy.
Finally, adding this (9%) real estate allocation to a mixed-asset portfolio allocation typical for German pension funds improves both the raw return (from 7.66% to 8.28%) and the Sharpe ratio (from 0.91 to 0.98).
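The blended return described above is a simple weighted average of the two allocations' period returns. A minimal sketch of the arithmetic, with entirely hypothetical return series (not the paper's data) and an illustrative risk-free rate:

```python
import statistics

# Hypothetical annual returns; only the 70/30 weighting comes from the abstract.
unlisted = [0.030, 0.020, 0.035, 0.025, 0.030]   # unlisted (Spezialfonds-style) proxy, made up
listed   = [0.120, -0.050, 0.150, 0.080, 0.100]  # global listed proxy, made up
w = 0.70                                          # weight on the unlisted leg

# Blended return each year is the weighted average of the two legs.
blend = [w * u + (1 - w) * l for u, l in zip(unlisted, listed)]

mean_ret = statistics.mean(blend)
vol = statistics.stdev(blend)                     # sample volatility of the blend
risk_free = 0.01                                  # illustrative risk-free rate
sharpe = (mean_ret - risk_free) / vol
```

The same arithmetic, applied to the MSCI Spezialfonds and EPRA Global Developed series over 2004-2015, underlies the blended figures quoted in the abstract.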
Abstract:
Current state-of-the-art global climate models produce different values for Earth’s mean temperature. When comparing simulations with each other and with observations it is standard practice to compare temperature anomalies with respect to a reference period. It is not always appreciated that the choice of reference period can affect conclusions, both about the skill of simulations of past climate, and about the magnitude of expected future changes in climate. For example, observed global temperatures over the past decade are towards the lower end of the range of CMIP5 simulations irrespective of what reference period is used, but exactly where they lie in the model distribution varies with the choice of reference period. Additionally, we demonstrate that projections of when particular temperature levels are reached, for example 2K above ‘pre-industrial’, change by up to a decade depending on the choice of reference period. In this article we discuss some of the key issues that arise when using anomalies relative to a reference period to generate climate projections. We highlight that there is no perfect choice of reference period. When evaluating models against observations, a long reference period should generally be used, but how long depends on the quality of the observations available. The IPCC AR5 choice to use a 1986-2005 reference period for future global temperature projections was reasonable, but a case-by-case approach is needed for different purposes and when assessing projections of different climate variables. Finally, we recommend that any studies that involve the use of a reference period should explicitly examine the robustness of the conclusions to alternative choices.
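The dependence on the reference period can be illustrated with a toy example: changing the baseline window shifts every anomaly by the difference between the two baseline means. The temperature series below is a synthetic linear trend, not observational data:

```python
# Synthetic annual global-mean temperatures (K): a pure 0.01 K/yr trend.
temps = {year: 287.0 + 0.01 * (year - 1900) for year in range(1900, 2021)}

def anomalies(series, ref_start, ref_end):
    """Anomalies relative to the mean over [ref_start, ref_end]."""
    ref = [series[y] for y in range(ref_start, ref_end + 1)]
    baseline = sum(ref) / len(ref)
    return {y: t - baseline for y, t in series.items()}

a1 = anomalies(temps, 1961, 1990)   # one common reference period
a2 = anomalies(temps, 1986, 2005)   # the AR5 choice mentioned above

# Same data, but every anomaly shifts by the difference in baseline means.
shift = a1[2020] - a2[2020]
```

With a linear trend, the shift equals the trend times the 20-year offset between the window midpoints (0.01 K/yr x 20 yr = 0.2 K), which is why conclusions about "where observations lie in the model distribution" can move with the reference period.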
Abstract:
The purpose of this paper is to develop a Bayesian analysis for nonlinear regression models under scale mixtures of skew-normal distributions. This novel class of models provides a useful generalization of symmetrical nonlinear regression models, since the error distributions cover both skewed and heavy-tailed distributions such as the skew-t, skew-slash and skew-contaminated normal distributions. The main advantage of this class of distributions is that it has a convenient hierarchical representation that allows the implementation of Markov chain Monte Carlo (MCMC) methods to simulate samples from the joint posterior distribution. In order to examine the robustness of this flexible class against outlying and influential observations, we present Bayesian case deletion influence diagnostics based on the Kullback-Leibler divergence. Further, some discussion of model selection criteria is given. The newly developed procedures are illustrated with two simulation studies and a real dataset previously analyzed under normal and skew-normal nonlinear regression models. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
The Southwest region of Bahia state in Brazil hosts the largest uranium reserve in the country (100 kton of uranium), as well as the cities of Caetite, Lagoa Real and Igapora. This work aimed to investigate uranium burdens in residents of these cities by using teeth as bioindicators, as a contribution to possible radiation protection measures. A total of 41 human teeth were collected, plus 50 from an allegedly uranium-free area (the control region). Uranium concentrations in teeth from residents aged 5 to 87 years were determined by means of a high-resolution inductively coupled plasma mass spectrometer (ICP-MS). The highest uranium concentrations were measured in samples belonging to residents of Caetite (median equal to 16 ppb). Assuming that uranium concentrations in teeth and bones agree within 10-20% (for children and young adults), it was concluded that uranium body levels in residents of Caetite are at least one order of magnitude higher than the worldwide average. This finding led to the conclusion that daily ingestion of uranium, from food and water, is equally high.
Abstract:
The present paper reports the study of the decolourisation of real textile effluent by constant-current electrolysis in a flow cell using DSA-type electrode materials. The effect of using different anode materials (Ti/Ru0.3Ti0.7O2; Ti/Ir0.3Ti0.7O2; Ti/RuxSn1-xO2, where x = 0.1, 0.2 or 0.3) on the efficiency of colour removal is discussed. Attempts to perform galvanostatic oxidation (40 and 60 mA cm^-2) on the as-received effluent demonstrate that colour removal and total organic carbon (TOC) removal are limited. In this case the greatest degree of colour removal is achieved when the anode containing 90% SnO2 is used. If the conductivity of the effluent is increased by adding NaCl (0.1 mol L^-1), appreciable colour/TOC removal is observed. The efficiencies of colour and TOC removal are discussed in terms of the energy per order (E_EO / kW h m^-3 order^-1) and the energy consumption (E_C / kW h kg^-1 TOC), respectively. Finally, the extent of colour removal is compared to consent levels reported in the literature. (C) 2008 Elsevier B.V. All rights reserved.
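The energy-per-order figure of merit quoted above is, in its standard form, the electrical energy needed to reduce a pollutant concentration (here, colour) by one order of magnitude per unit volume. A minimal sketch using the common batch-mode definition and hypothetical numbers, not values taken from the paper:

```python
import math

def energy_per_order(power_kw, time_h, volume_m3, c0, c):
    """E_EO in kW h m^-3 order^-1: energy per order of magnitude of
    concentration decay (standard batch-mode figure of merit)."""
    return power_kw * time_h / (volume_m3 * math.log10(c0 / c))

# Hypothetical run: 0.5 kW for 2 h treating 10 L, colour falling 100 -> 10
# (exactly one order of magnitude, so the log10 term equals 1).
e_eo = energy_per_order(power_kw=0.5, time_h=2.0, volume_m3=0.01, c0=100.0, c=10.0)
```

Lower E_EO means a more energy-efficient decolourisation, which is why it is a natural basis for comparing the anode compositions studied.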
Abstract:
The Igiogbe cultural heritage has existed since the founding of the Bini kingdom without any controversy; however, since the Supreme Court decision in Idehen v Idehen the issue of the Igiogbe has assumed new dimensions. The Igiogbe, the house in which a Benin man lived and died, devolves on his first son absolutely; but since the beginning of the 20th century, litigation over the real meaning of the Igiogbe and who is entitled to inherit it has increased. Controversy and increased litigation over the Igiogbe have occasioned a shift in the practice; the Binis are not conscious of some of these changes, and most of them still claim the Igiogbe practice is rigidly adhered to. This study of Igiogbe inheritance in the Bini kingdom is therefore carried out with a view to bringing out the changes in the Igiogbe cultural practice, using legal and anthropological tools to examine those changes. While laying the foundation for the discussion of the main research object, the researcher examined the origin and status of customary law in Nigeria. Thereafter I examined Igiogbe inheritance in the Bini kingdom. The Igiogbe and the issue of the first son were critically analyzed with the aid of the research questions, bringing out the changes in the Igiogbe concept from traditional to modern practice. The study shows the Igiogbe practice is still relevant in the modern Bini kingdom; however, the shift and changes in the practice of this cultural milieu have led me to ask some fundamental questions which I intend to answer in broader research work in future.
Abstract:
Wooden railway sleeper inspections in Sweden are currently performed manually by a human operator and are based on visual analysis. A machine vision based approach has been developed to emulate the visual abilities of the human operator and enable automation of the process. Through this process bad sleepers are identified and marked with a spot of a specific colour (blue in the current case) on the rail, so that maintenance operators can identify the spot and replace the sleeper. The motive of this thesis is to help the operators identify those sleepers which are marked by coloured spots, using an “Intelligent Vehicle” capable of running on the track. By capturing video while running on the track and segmenting the object of interest (the spot), we can automate this work and minimize human intervention. Since the video acquisition process depends on camera position and light source to obtain adequate brightness, we tested 4 different combinations of camera position and light source to record the videos and test the validity of the proposed method. A sequence of real-time rail frames is extracted from these videos and further processing (depending on the data acquisition process) is done to identify the spots. After identification, each frame is divided into 9 regions so that the particular region in which the spot lies is known, to avoid overlap with noise. The proposed method generates information on which of the nine regions in each frame the spot lies in. From the generated results we draw some conclusions regarding data collection techniques, efficiency, time and speed.
In this report, extensive experiments using image sequences from a particular camera are reported. The experiments were done using the intelligent vehicle as well as a test vehicle, and the results show that we achieved 95% success in identifying the spots when the video is used as recorded. In an alternative method, where some frames are skipped in pre-processing to increase the speed of the video, segmentation success was reduced to 85% but the processing time was much lower than before. This shows the validity of the proposed method for identifying spots on wooden railway sleepers, where time and efficiency can be traded off to obtain the desired result.
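The 3x3 frame partition described above amounts to mapping a pixel coordinate to one of nine cells. A minimal sketch (the row-major numbering is an assumption; the thesis does not specify an ordering):

```python
def region_index(x, y, width, height):
    """Map pixel (x, y) in a width x height frame to one of 9 regions,
    numbered 0..8 row-major (0 = top-left, 8 = bottom-right)."""
    col = min(2, x * 3 // width)    # 0, 1 or 2 across the frame
    row = min(2, y * 3 // height)   # 0, 1 or 2 down the frame
    return row * 3 + col

# A detected spot centred at (500, 100) in a hypothetical 720x576 frame
# falls in the top-right region.
r = region_index(500, 100, 720, 576)
```

Reporting only the region of the detected spot, rather than exact pixel coordinates, makes the output robust to small localisation noise, which matches the motivation given in the abstract.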
Abstract:
Of the ways in which agent behaviour can be regulated in a multiagent system, electronic contracting – based on explicit representation of different parties' responsibilities, and the agreement of all parties to them – has significant potential for modern industrial applications. Based on this assumption, the CONTRACT project aims to develop and apply electronic contracting and contract-based monitoring and verification techniques in real world applications. This paper presents results from the initial phase of the project, which focused on requirements solicitation and analysis. Specifically, we survey four use cases from diverse industrial applications, examine how they can benefit from an agent-based electronic contracting infrastructure and outline the technical requirements that would be placed on such an infrastructure. We present the designed CONTRACT architecture and describe how it may fulfil these requirements. In addition to motivating our work on the contract-based infrastructure, the paper aims to provide a much-needed community resource in terms of the use cases themselves, and to provide a clear commercial context for the development of work on contract-based systems.
Abstract:
In the past, the focus of drainage design was on sizing pipes and storages in order to provide sufficient network capacity. This traditional approach, together with computer software and technical guidance, was successful for many years. However, due to rapid population growth and urbanisation, the requirements of a “good” drainage design have changed significantly. In addition to water management, other aspects such as environmental impacts, amenity value and carbon footprint have to be considered during the design process. Going forward, we need to address the key sustainability issues carefully and practically. The key challenge in moving from simple objectives (e.g. capacity and costs) to complicated objectives (e.g. capacity, flood risk, environment, amenity, etc.) is the difficulty of striking a balance between the various objectives and justifying potential benefits and compromises. In order to assist decision makers, we developed a new decision support system for drainage design. The system consists of two main components – a multi-criteria evaluation framework for drainage systems and a multi-objective optimisation tool. The evaluation framework is used for the quantification of performance, life-cycle costs and benefits of different drainage systems. The optimisation tool can search for feasible combinations of design parameters, such as the sizes, order and type of drainage components, that maximise multiple benefits. In this paper, we will discuss real-world applications of the decision support system. A number of case studies have been developed based on recent drainage projects in China. We will use the case studies to illustrate how the evaluation framework highlights and compares the pros and cons of various design options. We will also discuss how the design parameters can be optimised based on the preferences of decision makers. The work described here is the output of an EngD project funded by EPSRC and XP Solutions.
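The core of a multi-criteria evaluation of the kind described above can be sketched as a weighted scoring of design options; the criteria, weights and scores below are invented for illustration and are not the paper's evaluation framework:

```python
# Hypothetical criteria weights (must be chosen by the decision makers).
weights = {"capacity": 0.4, "flood_risk": 0.3, "amenity": 0.2, "carbon": 0.1}

# Hypothetical normalised scores (0..1, higher is better) for two options.
options = {
    "pipes_only":       {"capacity": 0.9, "flood_risk": 0.5, "amenity": 0.2, "carbon": 0.4},
    "pipes_plus_swale": {"capacity": 0.8, "flood_risk": 0.7, "amenity": 0.8, "carbon": 0.7},
}

def score(option):
    """Weighted-sum score of one design option across all criteria."""
    return sum(weights[c] * option[c] for c in weights)

best = max(options, key=lambda name: score(options[name]))
```

A real framework of this kind would also handle criteria measured in incommensurable units (costs, flood depths, carbon) by normalising them before weighting, and the optimisation tool would search the design-parameter space for options that score well on several criteria at once.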
Abstract:
We discuss the development and performance of a low-power sensor node (hardware, software and algorithms) that autonomously controls the sampling interval of a suite of sensors based on local state estimates and future predictions of water flow. The problem is motivated by the need to accurately reconstruct abrupt state changes in urban watersheds and stormwater systems. Presently, the detection of these events is limited by the temporal resolution of sensor data. It is often infeasible, however, to increase measurement frequency due to energy and sampling constraints. This is particularly true for real-time water quality measurements, where sampling frequency is limited by reagent availability, sensor power consumption, and, in the case of automated samplers, the number of available sample containers. These constraints pose a significant barrier to the ubiquitous and cost effective instrumentation of large hydraulic and hydrologic systems. Each of our sensor nodes is equipped with a low-power microcontroller and a wireless module to take advantage of urban cellular coverage. The node persistently updates a local, embedded model of flow conditions while IP-connectivity permits each node to continually query public weather servers for hourly precipitation forecasts. The sampling frequency is then adjusted to increase the likelihood of capturing abrupt changes in a sensor signal, such as the rise in the hydrograph – an event that is often difficult to capture through traditional sampling techniques. Our architecture forms an embedded processing chain, leveraging local computational resources to assess uncertainty by analyzing data as it is collected. A network is presently being deployed in an urban watershed in Michigan and initial results indicate that the system accurately reconstructs signals of interest while significantly reducing energy consumption and the use of sampling resources. 
We also expand our analysis by discussing the role of this approach for the efficient real-time measurement of stormwater systems.
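The adaptive-sampling idea above can be sketched as a simple policy that shortens the sampling interval when an abrupt change is likely and lengthens it otherwise; the function name, thresholds and factors below are illustrative, not taken from the deployed system:

```python
def next_interval_s(base_s, forecast_rain_mm, rising_limb):
    """Choose the next sampling interval (seconds) from a baseline interval,
    an hourly precipitation forecast, and whether the local state estimate
    indicates a rising hydrograph limb."""
    if rising_limb or forecast_rain_mm > 5.0:
        return max(60, base_s // 8)   # dense sampling around likely events
    if forecast_rain_mm > 0.0:
        return base_s // 2            # moderate chance of abrupt change
    return base_s                     # quiescent: conserve energy and reagents

# Dry forecast, steady flow: stay at the hourly baseline.
quiet = next_interval_s(3600, 0.0, False)
# Heavy forecast rain: sample every few minutes to catch the hydrograph rise.
storm = next_interval_s(3600, 10.0, False)
```

The point of such a policy, as in the abstract, is that energy and sample containers are spent where the signal is most informative instead of uniformly in time.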
Abstract:
As a highly urbanized and flood prone region, Flanders has experienced multiple floods causing significant damage in the past. In response to the floods of 1998 and 2002 the Flemish Environment Agency, responsible for managing 1 400 km of unnavigable rivers, started setting up a real time flood forecasting system in 2003. Currently the system covers almost 2 000 km of unnavigable rivers, for which flood forecasts are accessible online (www.waterinfo.be). The forecasting system comprises more than 1 000 hydrologic and 50 hydrodynamic models which are supplied with radar rainfall, rainfall forecasts and on-site observations. Forecasts for the next 2 days are generated hourly, while 10 day forecasts are generated twice a day. Additionally, twice daily simulations based on percentile rainfall forecasts (from EPS predictions) result in uncertainty bands for the latter. Subsequent flood forecasts use the most recent rainfall predictions and observed parameters at any time while uncertainty on the longer-term is taken into account. The flood forecasting system produces high resolution dynamic flood maps and graphs at about 200 river gauges and more than 3 000 forecast points. A customized emergency response system generates phone calls and text messages to a team of hydrologists initiating a pro-active response to prevent upcoming flood damage. The flood forecasting system of the Flemish Environment Agency is constantly evolving and has proven to be an indispensable tool in flood crisis management. This was clearly the case during the November 2010 floods, when the agency issued a press release 2 days in advance allowing water managers, emergency services and civilians to take measures.
Abstract:
As colleges and universities make increasing global engagement a top institutional priority, many have struggled to manage rising levels of international activity. Council research finds that the challenge lies not in convincing faculty to expend more effort but instead in reducing the level of effort required by faculty who are already interested in promoting international activities. This study provides detailed case studies and toolkits for administrative core competencies for increased global engagement. Chapter 2 (page 39) details strategies to promote faculty-led study abroad programs, which constitute the fastest growing study abroad experience. Chapter 5 (page 111) outlines recommendations to build strategic international partnerships that engage the entire campus.
Abstract:
This dissertation evaluates macroeconomic management in Brazil from 1994 to the present, with particular focus on exchange rate policy. It points out that while Brazil's Real Plan succeeded in halting the hyperinflation that had reached more than 2000 percent in 1993, it also caused a significant real appreciation of the exchange rate, a situation that was only made worse by the extremely high interest rates and the ensuing bout of severe financial crises in the international arena. By the end of 1998, the accumulation of internal and external imbalances led the authorities to drop foreign exchange controls and allow the currency to float. In spite of some initial scepticism, the flexible rate regime cum inflation targeting proved to work well. Inflation was kept under control, the current account position improved significantly, real interest rates fell and GDP growth resumed. Thus, while great challenges still lie ahead, the recent successes bestow some optimism on the good functioning of this exchange rate regime. The Brazilian case suggests that a successful transition from one foreign exchange system to another, particularly during a financial crisis, does not depend on only one variable, be it fiscal or monetary. In reality, it depends on a whole set of co-ordinated policies aimed at restoring price stability with as little exchange rate and output volatility as possible.
Abstract:
The presence of inflation has induced financial institutions to implement procedures devised to protect the real value of their loans. Two such procedures, the floating rate scheme and the monetary correction mechanism, tend to lead to very different streams of payments. However, whenever the floating rate scheme follows the rule of strict adherence to the Fisher equation, the two procedures are financially equivalent.
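The claimed equivalence follows directly from the exact Fisher relation (1 + i) = (1 + r)(1 + pi), where i is the nominal rate, r the real rate and pi inflation: both schemes accumulate the same product of factors. A numerical sketch with hypothetical figures:

```python
principal = 1000.0
r = 0.05                       # real interest rate per period (hypothetical)
inflation = [0.10, 0.20, 0.15] # hypothetical per-period inflation rates

# Floating-rate scheme: the nominal rate is reset each period via Fisher.
bal_float = principal
for pi in inflation:
    i = (1 + r) * (1 + pi) - 1     # exact Fisher relation
    bal_float *= 1 + i

# Monetary-correction scheme: index the balance by inflation, then
# charge the real rate on the corrected balance.
bal_mc = principal
for pi in inflation:
    bal_mc *= (1 + pi)             # monetary correction
    bal_mc *= (1 + r)              # real interest on corrected balance
```

Each period both balances are multiplied by the same factor (1 + r)(1 + pi), so the two outstanding balances, and hence the payment streams they generate, coincide; the divergence the abstract mentions arises only when the floating rate departs from the exact Fisher relation.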
Abstract:
This paper examined the transmission of international prices of agricultural commodities into the real exchange rate in Brazil for the period from January 2000 to February 2010. We used time series models (ARIMA models, transfer function models, intervention analysis, and the Johansen cointegration test) to determine the short- and long-run elasticities. The transfer function model results show that changes in international prices of agricultural commodities are transmitted to the real exchange rate in Brazil in the short run; however, that transmission is less than unity, thus configuring an inelastic relationship. The Johansen cointegration tests show that these variables are not cointegrated and do not converge to a long-run equilibrium. These results are in agreement with Cashin et al. (2004), who also found no long-run relationship between the real exchange rate and commodity prices in the case of Brazil. These results show that monetary shocks have greater weight on changes in the real exchange rate than real shocks.
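A short-run elasticity below unity, as found above, can be illustrated by regressing log-differences of the exchange rate on log-differences of a commodity price index; the slope of that regression is the short-run elasticity. The series below are invented for illustration and are not the paper's data or its transfer function model:

```python
import math

# Hypothetical monthly levels: a commodity price index and an exchange rate.
commodity = [100.0, 104.0, 103.0, 108.0, 112.0, 110.0]
fx        = [2.00, 2.02, 2.02, 2.04, 2.06, 2.05]

# Log-differences approximate percentage changes.
dx = [math.log(commodity[i + 1] / commodity[i]) for i in range(len(commodity) - 1)]
dy = [math.log(fx[i + 1] / fx[i]) for i in range(len(fx) - 1)]

# OLS slope of dy on dx = short-run elasticity of pass-through.
mx = sum(dx) / len(dx)
my = sum(dy) / len(dy)
beta = sum((x - mx) * (y - my) for x, y in zip(dx, dy)) / \
       sum((x - mx) ** 2 for x in dx)
```

With these made-up series the slope comes out well below one, the inelastic pattern the abstract describes; the absence of cointegration in levels is a separate finding that this differenced regression does not test.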