847 results for REAL INTEREST-RATE


Relevance:

30.00%

Publisher:

Abstract:

Property ownership can tie up large amounts of capital and management energy that businesses could employ more productively elsewhere. Competitive pressures, accounting changes and increasingly sophisticated occupier requirements are building demand for new and innovative ways to satisfy corporate occupation needs. The investment climate is also changing. Falling interest rates and falling inflation can be expected to undermine returns from the traditional FRI lease. In future, investment returns will be more dependent on active and innovative management geared to the needs of the occupiers on whom income depends. Occupier and investor interests, therefore, look set to coincide, but unlocking the potential for both parties will depend on developing new finance and investment vehicles that align their respective needs. In the UK, examples include PFI in the public sector and off-balance-sheet financing in the private sector. In the USA, "synthetic lease" structures have also become popular. Growing investment-market experience in assessing risks and returns suggests scope for further innovative arrangements in the corporate sector. But how can such arrangements be structured? What are the risks, drivers and barriers?

Relevance:

30.00%

Publisher:

Abstract:

Real estate depreciation continues to be a critical issue for investors and the appraisal profession in the UK in the 1990s. Depreciation-sensitive cash flow models have been developed, but there is a real need to develop further empirical methodologies to determine rental depreciation rates for input into these models. Although building quality has been found to be an important explanatory variable in depreciation, it is very difficult to incorporate it into such models or to analyse it retrospectively. It is essential to examine previous depreciation research from real estate and economics in the USA and UK to understand the issues in constructing a valid and pragmatic way of calculating rental depreciation. Distinguishing between 'depreciation' and 'obsolescence' is important, and the pattern of depreciation in any study can be influenced by such factors as the type (longitudinal or cross-sectional) and timing of the study, and the market state. Longitudinal studies can analyse change more directly than cross-sectional studies. Any methodology for calculating rental depreciation rates should be formulated in the context of such issues as 'censored sample bias', 'lemons' and 'filtering', which have been highlighted in key US literature from the field of economic depreciation. Property depreciation studies in the UK have tended to overlook this literature, however. Although data limitations and constraints reduce the ability of empirical property depreciation work in the UK to consider these issues fully, 'averaging' techniques and ordinary least squares (OLS) regression can both provide a consistent way of calculating rental depreciation rates within a 'cohort' framework.

Relevance:

30.00%

Publisher:

Abstract:

Sea ice friction models are necessary to predict the nature of interactions between sea ice floes. These interactions are of interest on a range of scales, for example, to predict loads on engineering structures in icy waters or to understand the basin-scale motion of sea ice. Many models use Amontons' friction law due to its simplicity. More advanced models allow for hydrodynamic lubrication and refreezing of asperities; however, modeling these processes leads to greatly increased complexity. In this paper we propose, by analogy with rock physics, that a rate- and state-dependent friction law allows us to incorporate memory (and thus the effects of lubrication and bonding) into ice friction models without a great increase in complexity. We support this proposal with experimental data on both the laboratory (∼0.1 m) and ice tank (∼1 m) scales. These experiments show that the effects of static contact under normal load can be incorporated into a friction model. We find the parameters for a first-order rate and state model to be A = 0.310, B = 0.382, and μ0 = 0.872. Such a model then allows us to make predictions about the nature of memory effects in moving ice-ice contacts.
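The quoted parameters slot directly into the standard first-order rate- and state-dependent law from rock physics that the abstract invokes. A minimal sketch, assuming the Dieterich "aging" form of the state evolution and illustrative values for the reference velocity V0 and critical slip distance Dc (neither is given in the abstract):

```python
import math

# First-order rate-and-state friction (Dieterich "aging" law) -- a sketch.
# A, B, MU0 are the values quoted in the abstract; V0 (reference slip rate)
# and DC (critical slip distance) are hypothetical illustrative values.
A, B, MU0 = 0.310, 0.382, 0.872
V0, DC = 1e-3, 1e-3  # m/s, m (assumed)

def friction(v, theta):
    """Friction coefficient at slip rate v and state variable theta."""
    return MU0 + A * math.log(v / V0) + B * math.log(V0 * theta / DC)

def evolve_state(theta, v, dt):
    """Aging-law state evolution: d(theta)/dt = 1 - v * theta / Dc."""
    return theta + dt * (1.0 - v * theta / DC)

def mu_steady(v):
    """Steady-state friction (theta = Dc/v): mu0 + (A - B) * ln(v / V0)."""
    return MU0 + (A - B) * math.log(v / V0)
```

Because B > A for the quoted values, the steady-state contact is velocity-weakening, which is the property that lets such a model carry "memory" of static contact into moving ice-ice contacts.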

Relevance:

30.00%

Publisher:

Abstract:

A flood warning system incorporates telemetered rainfall and flow/water level data measured at various locations in the catchment area. Accurate real-time data collection is required for this purpose, and sensor networks improve the system's capabilities. However, existing sensor nodes struggle to satisfy hydrological requirements in terms of autonomy, sensor hardware compatibility, reliability and long-range communication. We describe the design and development of a real-time measurement system for flood monitoring, and its deployment in a flash-flood-prone 650 km² semiarid watershed in Southern Spain. A purpose-built low-power, long-range communication device, DatalogV1, provides automatic data gathering and reliable transmission. DatalogV1 incorporates self-monitoring to adapt measurement schedules, both to manage power consumption and to capture events of interest. Two tests are used to assess the success of the development. The results show an autonomous and robust monitoring system for the long-term collection of water level data at many sparse locations during flood events.

Relevance:

30.00%

Publisher:

Abstract:

We investigate variants of the dominating set problem in social networks. While randomised algorithms for solving the minimum weighted dominating set problem and the minimum alpha and alpha-rate domination problems on simple graphs are already present in the literature, we propose here a randomised algorithm for the minimum weighted alpha-rate dominating set problem which is, to the best of our knowledge, the first such algorithm. A theoretical approximation bound based on a simple randomised rounding technique is given. The algorithm is implemented in Python and applied to a UK Twitter mentions network using a measure of individuals' influence (Klout) as weights. We argue that the weights of vertices can be interpreted as the costs of getting those individuals on board for a campaign or a behaviour change intervention. The minimum weighted alpha-rate dominating set problem can therefore be seen as finding a set that minimises the total cost such that each individual in the network has at least an alpha fraction of its neighbours in the chosen set. We also test our algorithm on generated graphs with several thousand vertices and edges. Our results on this real-life Twitter network and on generated graphs show that the implementation is reasonably efficient and can thus be used in real-life applications when creating social-network-based interventions, designing social media campaigns and potentially improving users' social media experience.
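The abstract does not give the algorithm itself; the flavour of a randomised-rounding-plus-repair heuristic for the weighted alpha-rate constraint can be sketched as follows (the inclusion probability p and the cheapest-first greedy repair are illustrative choices, not the paper's method):

```python
import math
import random

def alpha_rate_dominating_set(adj, weights, alpha, p=0.3, seed=0):
    """Heuristic for the minimum weighted alpha-rate domination problem:
    return a set D such that every vertex has at least an alpha fraction of
    its neighbours in D.  Sketch only: a randomised inclusion step followed
    by a cheapest-first greedy repair of any unsatisfied vertex.
    adj: dict vertex -> list of neighbours; weights: dict vertex -> cost."""
    rng = random.Random(seed)
    D = {v for v in adj if rng.random() < p}      # randomised "rounding" step
    for v in adj:
        need = math.ceil(alpha * len(adj[v]))     # neighbours required in D
        have = sum(1 for u in adj[v] if u in D)
        # repair: add v's cheapest missing neighbours until satisfied
        for u in sorted((u for u in adj[v] if u not in D),
                        key=lambda u: weights[u]):
            if have >= need:
                break
            D.add(u)
            have += 1
    return D
```

Since the repair step only ever adds vertices, satisfying a later vertex can never break an earlier one, so the returned set is always feasible; only its total weight depends on the random draw.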

Relevance:

30.00%

Publisher:

Abstract:

This study examines the impact of foreign real estate investment on US office market capitalization rates. The geographic unit of analysis is the MSA and the time period is 2001-2013. Drawing upon a database of commercial real estate transactions provided by Real Capital Analytics, we model the determinants of market capitalization rates with a particular focus on the significance of the proportion of market transactions involving foreign investors. We employ several econometric techniques to explore the data, address potential estimation biases, and test the robustness of the results. The results suggest statistically significant effects of foreign investment across 38 US metro areas. It is estimated that, all else equal, a 100 basis point increase in the foreign share of total investment in a US metropolitan office market causes about an 8 basis point decrease in the market cap rate.

Relevance:

30.00%

Publisher:

Abstract:

Real estate securities have a number of distinct characteristics that differentiate them from stocks generally. Key amongst these is that the firms are underpinned by both real and investment assets. The connection between the underlying macro-economy and listed real estate firms is therefore clearly demonstrated and of heightened importance. To consider the linkages with underlying macro-economic fundamentals, we extract the 'low-frequency' volatility component from aggregate volatility shocks in 11 international markets over the 1990-2014 period. This is achieved using Engle and Rangel's (2008) Spline-Generalized Autoregressive Conditional Heteroskedasticity (Spline-GARCH) model. The estimated low-frequency volatility is then examined together with low-frequency macro data in a fixed-effects pooled regression framework. The analysis reveals that the low-frequency volatility of real estate securities has a strong and positive association with most of the macroeconomic risk proxies examined, including interest rates, inflation, GDP and foreign exchange rates.

Relevance:

30.00%

Publisher:

Abstract:

This article examines the impact of foreign real estate investment on U.S. office market capitalization rates. The geographic unit of analysis is the MSA and the time period is 2001-2013. Drawing upon a database of commercial real estate transactions, the authors model the determinants of market capitalization rates with a particular focus on the significance of the proportion of market transactions involving foreign investors. Employing several econometric techniques to analyze the data, the results suggest statistically significant effects of foreign investment across 38 U.S. metro areas. It is estimated that, all else equal, a 100 basis point increase in foreign share of total investment in a U.S. metropolitan office market causes about an 8 basis point decrease in the market cap rate.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we develop a flexible cure rate survival model by assuming that the number of competing causes of the event of interest follows a compound weighted Poisson distribution. This model is more flexible in terms of dispersion than the promotion time cure model. Moreover, it gives an interesting and realistic interpretation of the biological mechanism behind the occurrence of the event of interest, as it includes a destructive process of the initial risk factors in a competitive scenario. In other words, what is recorded is only the undamaged portion of the original number of risk factors.

Relevance:

30.00%

Publisher:

Abstract:

In this paper we deal with a Bayesian analysis for right-censored survival data suitable for populations with a cure rate. We consider a cure rate model based on the negative binomial distribution, encompassing as a special case the promotion time cure model. Bayesian analysis is based on Markov chain Monte Carlo (MCMC) methods. We also present some discussion on model selection and an illustration with a real dataset.
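A minimal sketch of the population survival function implied by a negative binomial cure rate model, under one common parameterisation (assumed here, with dispersion phi; phi -> 0 recovers the promotion time cure model that the abstract mentions as a special case):

```python
import math

def surv_pop(t, theta, phi, F):
    """Population survival under a negative-binomial cure rate model:
    S_pop(t) = (1 + phi * theta * F(t)) ** (-1/phi),
    where F is the c.d.f. of the time-to-event of each latent cause,
    theta > 0 its mean count, and phi >= 0 the dispersion.  As phi -> 0
    this tends to the promotion time model exp(-theta * F(t)).
    (One common parameterisation, assumed for illustration.)"""
    if phi == 0.0:  # promotion time (Poisson) limit
        return math.exp(-theta * F(t))
    return (1.0 + phi * theta * F(t)) ** (-1.0 / phi)

def cure_fraction(theta, phi):
    """Cured proportion = limit of S_pop as t -> infinity (F -> 1)."""
    return surv_pop(float("inf"), theta, phi, lambda t: 1.0)
```

The cured proportion (1 + phi * theta) ** (-1/phi) is strictly positive, which is what makes the model "suitable for populations with a cure rate": the survival curve plateaus instead of decaying to zero.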

Relevance:

30.00%

Publisher:

Abstract:

In this paper we extend the long-term survival model proposed by Chen et al. [Chen, M.-H., Ibrahim, J.G., Sinha, D., 1999. A new Bayesian model for survival data with a surviving fraction. Journal of the American Statistical Association 94, 909-919] via the generating function of a real sequence introduced by Feller [Feller, W., 1968. An Introduction to Probability Theory and Its Applications, third ed., vol. 1, Wiley, New York]. A direct consequence of this new formulation is the unification of the long-term survival models proposed by Berkson and Gage [Berkson, J., Gage, R.P., 1952. Survival curve for cancer patients following treatment. Journal of the American Statistical Association 47, 501-515] and Chen et al. (see citation above). We also show that the long-term survival function formulated in this paper satisfies the proportional hazards property if, and only if, the number of competing causes related to the occurrence of an event of interest follows a Poisson distribution. Furthermore, a more flexible model than the one proposed by Yin and Ibrahim [Yin, G., Ibrahim, J.G., 2005. Cure rate models: A unified approach. The Canadian Journal of Statistics 33, 559-570] is introduced and, motivated by Feller's results, a very useful competing index is defined.
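The Poisson special case singled out by the proportional-hazards result is the standard Chen et al. (1999) construction the abstract extends; written out (a sketch of the cited model, not this paper's generalisation):

```latex
% Promotion time model of Chen et al. (1999): N ~ Poisson(theta) latent
% causes, each with i.i.d. event time Z_i ~ F, observed time T = min_i Z_i
% (T = infinity when N = 0, giving the surviving fraction).
S_{\mathrm{pop}}(t)
  = \sum_{n \ge 0} e^{-\theta}\,\frac{\theta^{n}}{n!}\,[1 - F(t)]^{n}
  = e^{-\theta F(t)},
\qquad
h_{\mathrm{pop}}(t) = \theta f(t).
```

Since covariates enter only through theta, the population hazard scales multiplicatively across subjects, i.e. proportional hazards; the paper's result is that within the generating-function family this property characterises the Poisson case.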

Relevance:

30.00%

Publisher:

Abstract:

Wooden railway sleeper inspections in Sweden are currently performed manually by a human operator, based on visual analysis. A machine-vision approach has been developed to emulate the operator's visual abilities and enable automation of the process. Through this process defective sleepers are identified and marked on the rail with a spot of a specific colour (blue in the current case), so that maintenance operators can find the spot and replace the sleeper. The aim of this thesis is to help operators identify the colour-marked sleepers using an "intelligent vehicle" capable of running on the track. By capturing video while running on the track and segmenting the object of interest (the spot), the work can be automated and human intervention minimised. Because the video acquisition process depends on camera position and light source to obtain adequate brightness, four different combinations of camera position and light source were tested to record video and assess the validity of the proposed method. A sequence of real-time rail frames is extracted from these videos and further processed (depending on the data acquisition process) to identify the spots. After a spot is identified, each frame is divided into 9 regions to determine the particular region in which the spot lies and to avoid overlap with noise. The proposed method thus reports, for each frame, which of the nine regions contains the spot. From the generated results, the data collection techniques are compared in terms of efficiency, time and speed.
In this report, extensive experiments using image sequences from a particular camera are described; the experiments were run on both the intelligent vehicle and a test vehicle. The results show 95% success in identifying the spots when the video is used as-is. In an alternative method, some frames are skipped during pre-processing to increase the processing speed, but the segmentation success drops to 85%, with far less processing time than before. This demonstrates the validity of the proposed method for identifying spots on wooden railway sleepers, where time and efficiency can be traded off to obtain the desired result.
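The colour-segmentation and nine-region steps described above can be sketched as follows; the RGB thresholds for "blue" and the row-major region indexing are illustrative assumptions, not the thesis's calibrated values:

```python
import numpy as np

def find_blue_spot(frame_rgb):
    """Locate the blue maintenance mark in one rail frame (H x W x 3 RGB
    array) and return the index 0..8 of the 3x3 region containing its
    centroid, or None if no spot is found.  Sketch only: a crude RGB
    threshold stands in for the thesis's segmentation step."""
    r = frame_rgb[..., 0].astype(int)
    g = frame_rgb[..., 1].astype(int)
    b = frame_rgb[..., 2].astype(int)
    # "blue" = strong blue channel dominating red and green (assumed thresholds)
    mask = (b > 120) & (b > r + 40) & (b > g + 40)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None                      # no spot in this frame
    cx, cy = int(xs.mean()), int(ys.mean())   # spot centroid
    h, w = mask.shape
    return (cy * 3 // h) * 3 + (cx * 3 // w)  # row-major region index 0..8
```

Restricting the decision to the centroid's region is one simple way to realise the thesis's goal of avoiding overlap with isolated noise pixels elsewhere in the frame.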

Relevance:

30.00%

Publisher:

Abstract:

We discuss the development and performance of a low-power sensor node (hardware, software and algorithms) that autonomously controls the sampling interval of a suite of sensors based on local state estimates and future predictions of water flow. The problem is motivated by the need to accurately reconstruct abrupt state changes in urban watersheds and stormwater systems. Presently, the detection of these events is limited by the temporal resolution of sensor data. It is often infeasible, however, to increase measurement frequency due to energy and sampling constraints. This is particularly true for real-time water quality measurements, where sampling frequency is limited by reagent availability, sensor power consumption, and, in the case of automated samplers, the number of available sample containers. These constraints pose a significant barrier to the ubiquitous and cost-effective instrumentation of large hydraulic and hydrologic systems. Each of our sensor nodes is equipped with a low-power microcontroller and a wireless module to take advantage of urban cellular coverage. The node persistently updates a local, embedded model of flow conditions, while IP connectivity permits each node to continually query public weather servers for hourly precipitation forecasts. The sampling frequency is then adjusted to increase the likelihood of capturing abrupt changes in a sensor signal, such as the rise of the hydrograph, an event that is often difficult to capture through traditional sampling techniques. Our architecture forms an embedded processing chain, leveraging local computational resources to assess uncertainty by analyzing data as it is collected. A network is presently being deployed in an urban watershed in Michigan, and initial results indicate that the system accurately reconstructs signals of interest while significantly reducing energy consumption and the use of sampling resources.
We also expand our analysis by discussing the role of this approach for the efficient real-time measurement of stormwater systems.
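The adaptive scheduling idea above, sampling densely when the local state estimate or the precipitation forecast suggests an abrupt change is likely, can be sketched as follows; the thresholds and interval bounds are illustrative assumptions, not the deployed node's values:

```python
def next_interval(level_change_rate, rain_prob, base=900, fast=60):
    """Choose the next sampling interval in seconds from the local state
    estimate (water-level change rate, m/s) and the forecast probability of
    precipitation in the next hour.  Sketch of the scheduling policy only:
    sample fast when the signal is already changing quickly or rain is
    likely; otherwise scale between the fast and slow schedules."""
    if abs(level_change_rate) > 0.01 or rain_prob > 0.5:
        return fast  # abrupt change expected: densest sampling
    # quiet conditions: interpolate toward `fast` as the forecast worsens
    return int(base - (base - fast) * rain_prob)
```

A policy of this shape concentrates the node's limited energy, reagents and sample containers on the hydrograph rise, which is the event the paper identifies as hardest to capture with fixed-interval sampling.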

Relevance:

30.00%

Publisher:

Abstract:

The article reviews the modelling of District Metered Areas (DMAs) with relatively high leakage rates. As a generally recognised approach to modelling leakage does not exist, engineers and other researchers usually model leakage by dividing the whole leakage rate evenly among all available nodes of the model. In this article, a new methodology is proposed to determine nodal leakage using a hydraulic model. The proposed methodology takes into consideration the IWA water balance methodology, the Minimum Night Flow (MNF) analysis, the number of connections related to each node and the material of the pipes. In addition, the model is illustrated with a real case study, as applied in Kalipoli's DMA. Results show that the proposed model gives reliable results.
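The allocation step can be sketched as follows: instead of an even split, each node receives a share of the MNF-derived leakage weighted by its number of service connections and a pipe-material factor (the factors below are illustrative assumptions, not the article's calibrated values):

```python
def allocate_leakage(total_leakage, nodes):
    """Distribute a DMA's leakage rate (e.g. from a minimum-night-flow
    analysis) across hydraulic-model nodes.  Sketch of the weighting idea:
    nodes maps node id -> (number of connections, pipe material), and each
    node's share is proportional to connections * material factor."""
    material_factor = {"PVC": 0.8, "steel": 1.0, "asbestos_cement": 1.5}  # assumed
    weights = {n: conns * material_factor.get(mat, 1.0)
               for n, (conns, mat) in nodes.items()}
    total_w = sum(weights.values())
    return {n: total_leakage * w / total_w for n, w in weights.items()}
```

The returned nodal demands can then be added to the base demands in the hydraulic model, so that leakage concentrates where connections are dense and pipes are leak-prone rather than being spread uniformly.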

Relevance:

30.00%

Publisher:

Abstract:

The exchange rate is the most strategic of the four macroeconomic prices. It determines not only exports and imports, but also real wages, consumption and the savings rate. Conventional theory holds that it is impossible to manage it, and that the only alternatives are to fix it or to float it. The experience of the East Asian countries, which use it strategically, demonstrates that this claim is false.