Abstract:
This paper sets out progress during the first eighteen months of doctoral research into the City of London office market. The overall aim of the research is to explore relationships between office rents and the economy in the UK over the last 150 years. To do this, a database of lettings has been created from which a long run index of City office rents can be constructed. With this index, it should then be possible to analyse trends in rents and relationships with their long run determinants. The focus of this paper is on the creation of the rent database. First, it considers the existing secondary sources of long run rental data for the UK. This highlights a lack of information for years prior to 1970 and the need for primary data collection if earlier periods are to be studied. The paper then discusses the selection of the City of London and of the time period chosen for research. After this, it describes how a dataset covering the period 1860-1960 has been assembled using the records of property companies active in the City office market. It is hoped that, if successful, this research will contribute to existing knowledge on the long run characteristics of commercial real estate. In particular, it should add a price dimension (rents) to the existing long run information on stock/supply and investment. Hence, it should enable a more complete picture of the development and performance of commercial real estate through time to be gained.
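The index construction described above can be illustrated with a minimal sketch. This is a hypothetical example, not the paper's actual method (which may use hedonic or repeat-letting techniques): it simply averages rent observations per year from letting records and rebases them to 100 in a chosen base year.

```python
from collections import defaultdict

def rent_index(lettings, base_year):
    """Build a base-100 rent index from letting records.

    Each record is (year, rent_per_sq_ft). Illustrative only: the
    paper's actual index construction may differ.
    """
    totals, counts = defaultdict(float), defaultdict(int)
    for year, rent in lettings:
        totals[year] += rent
        counts[year] += 1
    means = {y: totals[y] / counts[y] for y in totals}
    base = means[base_year]
    return {y: 100.0 * means[y] / base for y in sorted(means)}

# Hypothetical City office lettings (year, rent per sq ft)
lettings = [(1860, 0.5), (1860, 0.7), (1900, 1.2), (1900, 1.0), (1960, 3.3)]
index = rent_index(lettings, base_year=1860)  # index[1860] == 100.0
```

Trends in rents can then be read off directly as movements of the index relative to the base year.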
Abstract:
Khartoum, like many cities in least developed countries (LDCs), still witnesses a huge influx of people. Accommodating the newcomers leads to encroachment on cultivated land and to the sprawling expansion of Greater Khartoum. The city expanded in diameter from 16.8 km in 1955 to 802.5 km in 1998. Most of this horizontal expansion was residential. In 2008 Khartoum accommodated 29% of the urban population of Sudan. Today Khartoum is one of 43 major cities in Africa with more than 1 million inhabitants. Most newcomers live on the outskirts of the city, e.g. the Dar El-Salam and Mayo neighbourhoods. The majority of them build their houses, especially the walls, from mud, wood, straw and sacks. Building materials are usually selected on price, regardless of environmental impact, quality, thermal performance and the life of the material. Most of the time this increases costs and produces varied impacts on the environment over the life of the building. Therefore, consideration of the environmental, social and economic impacts is crucial in the selection of any building material; decreasing such impacts could lead to more sustainable housing. The sustainability of the available wall building materials for low-cost housing in Khartoum is compared using the life cycle assessment (LCA) technique. The purpose of this paper is to compare the most available local wall building materials for the urban poor of Khartoum from a sustainability point of view, covering the manufacture of the materials, their use, and their disposal once their life comes to an end. Findings reveal that traditional red bricks cannot be considered a sustainable wall building material to shape the future of low-cost housing in Greater Khartoum. On the other hand, the results of the comparison draw attention to the wide range of soil techniques and to their potential as a promising sustainable wall material for urban low-cost housing in Khartoum.
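The stage-by-stage comparison described above can be sketched as a weighted aggregation of life-cycle impacts. This is a simplified illustration with hypothetical impact values and weights, not the paper's actual LCA inventory (real LCA follows standardized procedures such as ISO 14040):

```python
def lca_score(stage_impacts, weights):
    """Aggregate life-cycle impacts (manufacture, use, disposal) into a
    single weighted score; lower is better. Illustrative numbers only."""
    return sum(weights[stage] * value for stage, value in stage_impacts.items())

# Hypothetical impact values per wall material, by life-cycle stage
materials = {
    "red brick":  {"manufacture": 8.0, "use": 4.0, "disposal": 2.0},
    "soil block": {"manufacture": 2.0, "use": 3.0, "disposal": 1.0},
}
weights = {"manufacture": 0.5, "use": 0.3, "disposal": 0.2}
scores = {name: lca_score(impacts, weights) for name, impacts in materials.items()}
```

Under such a scheme, a material whose manufacture is energy-intensive (as with fired red bricks) accumulates a worse score than one with low-energy production, mirroring the paper's qualitative conclusion.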
Abstract:
Purpose – Commercial real estate is a highly specific asset: heterogeneous, indivisible and with less information transparency than most other commonly held investment assets. These attributes encourage the use of intermediaries during asset acquisition and disposal. However, there are few attempts to explain the use of different brokerage models (with differing costs) in different markets. This study aims to address this gap. Design/methodology/approach – The study analyses 9,338 real estate transactions in London and New York City from 2001 to 2011. Data are provided by Real Capital Analytics and cover over $450 billion of investments in this period. Brokerage trends in the two cities are compared and probit regressions are used to test whether the decision to transact with broker representation varies with investor or asset characteristics. Findings – Results indicate greater use of brokerage in London, especially by purchasers. This persists when data are disaggregated by sector, time or investor type, pointing to the role of local market culture and institutions in shaping brokerage models and transaction costs. Within each city, the nature of the investors involved seems to be a more significant influence on broker use than the characteristics of the assets being traded. Originality/value – Brokerage costs are the single largest non-tax charge to an investor when trading commercial real estate, yet there is little research in this area. This study examines the role of brokers and provides empirical evidence on factors that influence the use and mode of brokerage in two major investment destinations.
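The probit specification used to model the broker-use decision can be sketched with stdlib tools. This is a toy illustration, not the study's actual model: the regressors, data and estimation routine are hypothetical, and only the log-likelihood (the quantity maximized when fitting a probit) is shown.

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def probit_loglik(beta, X, y):
    """Log-likelihood of a probit model: P(y=1 | x) = Phi(x'beta).
    y=1 means the transaction used broker representation."""
    ll = 0.0
    for xi, yi in zip(X, y):
        z = sum(b * x for b, x in zip(beta, xi))
        p = min(max(norm_cdf(z), 1e-12), 1.0 - 1e-12)  # guard log(0)
        ll += math.log(p) if yi == 1 else math.log(1.0 - p)
    return ll

# Toy data: intercept plus one hypothetical investor characteristic
X = [(1.0, 0.0), (1.0, 1.0), (1.0, 1.0), (1.0, 0.0)]
y = [0, 1, 1, 0]
ll = probit_loglik((0.0, 1.0), X, y)
```

Maximizing this likelihood over `beta` yields the coefficients whose signs indicate whether an investor or asset characteristic raises or lowers the probability of transacting with broker representation.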
Abstract:
Under particular large-scale atmospheric conditions, several windstorms may affect Europe within a short time period. The occurrence of such cyclone families leads to large socioeconomic impacts and cumulative losses. The serial clustering of windstorms is analyzed for the North Atlantic/western Europe. Clustering is quantified as the dispersion (ratio variance/mean) of cyclone passages over a certain area. Dispersion statistics are derived for three reanalysis data sets and a 20-run European Centre Hamburg Version 5 /Max Planck Institute Version–Ocean Model Version 1 global climate model (ECHAM5/MPI-OM1 GCM) ensemble. The dependence of the seriality on cyclone intensity is analyzed. Confirming previous studies, serial clustering is identified in reanalysis data sets primarily on both flanks and downstream regions of the North Atlantic storm track. This pattern is a robust feature in the reanalysis data sets. For the whole area, extreme cyclones cluster more than nonextreme cyclones. The ECHAM5/MPI-OM1 GCM is generally able to reproduce the spatial patterns of clustering under recent climate conditions, but some biases are identified. Under future climate conditions (A1B scenario), the GCM ensemble indicates that serial clustering may decrease over the North Atlantic storm track area and parts of western Europe. This decrease is associated with an extension of the polar jet toward Europe, which implies a tendency to a more regular occurrence of cyclones over parts of the North Atlantic Basin poleward of 50°N and western Europe. An increase of clustering of cyclones is projected south of Newfoundland. The detected shifts imply a change in the risk of occurrence of cumulative events over Europe under future climate conditions.
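The dispersion statistic used to quantify clustering (the variance-to-mean ratio of cyclone counts) is straightforward to compute. A minimal sketch with invented seasonal counts; values above 1 indicate overdispersion relative to a Poisson process, i.e. serial clustering:

```python
def dispersion(counts):
    """Dispersion of event counts: variance / mean.
    > 1 suggests serial clustering; ~ 1 random (Poisson);
    < 1 more regular than random."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / n
    return var / mean

# Illustrative seasonal cyclone-passage counts over one grid cell
clustered = [0, 0, 7, 1, 0, 6, 0, 8]  # storms arriving in families
regular   = [3, 2, 3, 3, 2, 3, 3, 3]  # near-constant occurrence
```

Applying this statistic grid-point by grid-point over the North Atlantic/European domain yields the spatial clustering patterns discussed above.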
Abstract:
A wide range of applications would benefit from dense networks of air temperature observations, but costs, existing siting guidelines and the risk of damage to sensors limit deployment; new methods are required to gain a high-resolution understanding of the spatio-temporal patterns of urban meteorological phenomena, such as the urban heat island, or to serve precision-farming needs. With the launch of a new generation of low-cost sensors it is possible to deploy a network to monitor air temperature at finer spatial resolutions. Here we investigate the Aginova Sentinel Micro (ASM) sensor with a bespoke radiation shield (together < US$150), which can provide secure near-real-time air temperature data to a server using existing (or user-deployed) Wireless Fidelity (Wi-Fi) networks. This makes it ideally suited for deployment where wireless communications readily exist, notably urban areas. Assessment of the performance of the ASM relative to traceable standards in a water bath and an atmospheric chamber shows it to have good measurement accuracy, with mean errors < ±0.22 °C between -25 and 30 °C and a time constant in ambient air of 110 ± 15 s. Subsequent field tests within the bespoke shield also showed excellent performance (root-mean-square error = 0.13 °C) over a range of meteorological conditions relative to a traceable operational UK Met Office platinum resistance thermometer. These results indicate that the ASM and bespoke shield are more than fit for purpose for dense network deployment in urban areas at relatively low cost compared with existing observation techniques.
Abstract:
A fully susceptible genotype (4106A) of Myzus persicae survived the longest on an artificial diet and, in several of the eight replicates, monitoring was terminated while the culture was still thriving. A genotype with elevated carboxylesterase FE4 at the R3 level (800F) had a mean survival of only 98.13 days, whereas 794J, which combines R3 E4 carboxylesterase with target-site resistance (knockdown resistance), survived for an even shorter mean time of 84.38 days. The poorer survival of the two genotypes with extremely elevated carboxylesterase resistance was not the result of a reluctance to transfer to new diet at each diet change. Although available for only two replicates, a revertant clone of 794J (794Jrev), which has the same genotype as 794J but in which the amplified E4 genes are not expressed, giving a fully susceptible phenotype, did not appear to survive any better than 794J itself. This suggests that the poor survival of the extreme-carboxylesterase genotypes on an artificial diet is not the result of the cost of over-producing the enzyme. The frequency of insecticide-resistant genotypes is low in the population until insecticide is applied, indicating that they have reduced fitness, although this does not necessarily reflect a direct cost of expressing the resistance mechanism.
Abstract:
This study examines the long-run performance of initial public offerings on the Stock Exchange of Mauritius (SEM). The results show that the 3-year equally weighted cumulative adjusted returns average −16.5%. The magnitude of this underperformance is consistent with most reported studies in different developed and emerging markets. Based on multivariate regression models, firms with small issues and higher ex ante financial strength seem on average to experience greater long-run underperformance, supporting the divergence of opinion and overreaction hypotheses. On the other hand, Mauritian firms do not on average time their offerings to lower the cost of capital and, as such, there seems to be limited support for the windows of opportunity hypothesis.
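The cumulative adjusted return measure underlying the −16.5% figure can be sketched as follows. This is a generic market-adjusted formulation with toy numbers, not the study's actual data; IPO studies also use buy-and-hold and portfolio-weighted variants.

```python
def cumulative_adjusted_return(stock_returns, market_returns):
    """Cumulative market-adjusted return over the event window:
    sum of (r_it - r_mt) for each period t."""
    return sum(r - m for r, m in zip(stock_returns, market_returns))

def average_car(returns_by_firm, market_returns):
    """Equally weighted average CAR across firms."""
    cars = [cumulative_adjusted_return(r, market_returns) for r in returns_by_firm]
    return sum(cars) / len(cars)

# Toy monthly returns for two hypothetical IPO firms vs. a market index
firms = [[0.01, -0.03, 0.00], [-0.02, 0.01, -0.04]]
market = [0.02, 0.00, 0.01]
avg_car = average_car(firms, market)  # negative => aftermarket underperformance
```

A negative equally weighted average over a 3-year window is exactly the kind of result the abstract reports.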
Abstract:
Our digital universe is rapidly expanding: more and more daily activities are digitally recorded, data arrive in streams, need to be analyzed in real time and may evolve over time. In the last decade, many adaptive learning algorithms and prediction systems, which can automatically update themselves with new incoming data, have been developed. The majority of those algorithms focus on improving predictive performance and assume that a model update is always desired, as soon and as frequently as possible. In this study we consider a potential model update as an investment decision which, as in the financial markets, should be taken only if a certain return on investment is expected. We introduce and motivate a new research problem for data streams: cost-sensitive adaptation. We propose a reference framework for analyzing adaptation strategies in terms of costs and benefits. Our framework allows us to characterize and decompose the costs of model updates, and to assess and interpret the gains in performance due to model adaptation for a given learning algorithm on a given prediction task. Our proof-of-concept experiment demonstrates how the framework can aid in analyzing and managing adaptation decisions in the chemical industry.
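The investment framing of model updates can be sketched as a simple decision rule. This is a minimal illustration of the idea, with hypothetical cost and gain units; the paper's actual framework decomposes update costs (labelling, retraining, deployment, etc.) in more detail.

```python
def should_update(expected_gain, update_cost, min_roi=0.0):
    """Treat a model update as an investment: update only if the
    expected return on investment exceeds a required threshold.

    expected_gain and update_cost are in the same (hypothetical) units,
    e.g. monetized prediction-error reduction vs. retraining effort.
    """
    if update_cost <= 0:
        return expected_gain > 0
    roi = (expected_gain - update_cost) / update_cost
    return roi > min_roi

# e.g. retraining costs 5 units; adapt only if the expected gain
# pays that back with at least a 20% margin
decision = should_update(expected_gain=8.0, update_cost=5.0, min_roi=0.2)
```

Under this rule, frequent low-value updates (the default behaviour of many adaptive learners) are rejected whenever the gain does not cover the cost.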
Abstract:
We present results from experimental price-setting oligopolies in which green firms undertake different levels of energy-saving investments motivated by public subsidies and demand-side advantages. We find that consumers reveal higher willingness to pay for greener sellers’ products. This observation, in conjunction with the fact that greener sellers set higher prices, is compatible with interpreting energy-saving behaviour as a differentiation strategy. However, sellers do not exploit the resulting advantage through sufficiently high price-cost margins, because they seem trapped in “run to stay still” competition. Regarding the use of public subsidies to energy-saving sellers, we uncover an undesirable crowding-out effect on consumers’ intrinsic tendency to support green manufacturers: consumers may be less willing to support a green seller whose energy-saving strategy yields a direct financial benefit. Finally, we disentangle two alternative motivations for consumers’ attraction to pro-social firms: first, the self-interested recognition of the firm’s contribution to public and private welfare and, second, the need to compensate a firm for the cost entailed in each pro-social action. Our results show the prevalence of the former over the latter.
Abstract:
Over the past 30 years, cost–benefit analysis (CBA) has been applied to various areas of public policies and projects. The aim of this essay is to describe the origins of CBA, classify typologies of costs and benefits, define efficiency under CBA and discuss issues associated with the use of a microeconomic tool in macroeconomic contexts.
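The efficiency criterion at the heart of CBA is usually expressed as a positive net present value of discounted benefits minus costs. A minimal sketch with invented project figures; the essay's own typologies of costs and benefits are of course richer than two flat streams:

```python
def npv(benefits, costs, discount_rate):
    """Net present value: NPV = sum_t (B_t - C_t) / (1 + r)^t.
    Under CBA, a project passes the (Kaldor-Hicks) efficiency test
    if NPV > 0 at the chosen discount rate."""
    return sum((b - c) / (1.0 + discount_rate) ** t
               for t, (b, c) in enumerate(zip(benefits, costs)))

# Toy project: up-front cost in year 0, net benefits in years 1-3
benefits = [0.0, 50.0, 50.0, 50.0]
costs = [120.0, 5.0, 5.0, 5.0]
value = npv(benefits, costs, discount_rate=0.05)
```

Note how the verdict depends on the discount rate: the same project can pass at 5% and fail at a higher rate, one reason the choice of rate is contentious when a microeconomic tool is applied in macroeconomic contexts.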
Abstract:
An FTC-DOJ study argues that state laws and regulations may inhibit the unbundling of real estate brokerage services in response to new technology. Our data show that 18 states have changed laws in ways that promote unbundling since 2000. We model brokerage costs, as measured by the number of agents, in a state-level annual panel vector autoregressive framework, a novel way of analyzing wasteful competition. Our findings support a positive relationship between brokerage costs and lagged house prices and transactions. We find that the number of full-service brokers responds negatively (by well over two percentage points per year) to legal changes facilitating unbundling.
Abstract:
As part of an international intercomparison project, a set of single column models (SCMs) and cloud-resolving models (CRMs) are run under the weak temperature gradient (WTG) method and the damped gravity wave (DGW) method. For each model, the implementation of the WTG or DGW method involves a simulated column which is coupled to a reference state defined with profiles obtained from the same model in radiative-convective equilibrium. The simulated column has the same surface conditions as the reference state and is initialized with profiles from the reference state. We performed a systematic comparison of the behavior of different models under a consistent implementation of the WTG and DGW methods, and of the two methods across models with different physics and numerics. CRMs and SCMs produce a variety of behaviors under both methods. Some models reproduce the reference state while others sustain a large-scale circulation that results in either substantially lower or higher precipitation than in the reference state. CRMs show a fairly linear relationship between precipitation and circulation strength. SCMs display a wider range of behaviors than CRMs. Some SCMs under the WTG method produce zero precipitation. Within an individual SCM, a DGW simulation and a corresponding WTG simulation can produce circulations of opposite sign. When initialized with a dry troposphere, DGW simulations always result in a precipitating equilibrium state. The greatest sensitivity to initial moisture conditions is the occurrence of multiple stable equilibria in some WTG simulations, corresponding to either a dry equilibrium state when initialized dry or a precipitating equilibrium state when initialized moist. Multiple equilibria are seen in more WTG simulations at higher SST. In some models, the existence of multiple equilibria is sensitive to parameters in the WTG calculations.
Abstract:
The predictability of high impact weather events on multiple time scales is a crucial issue both in scientific and socio-economic terms. In this study, a statistical-dynamical downscaling (SDD) approach is applied to an ensemble of decadal hindcasts obtained with the Max-Planck-Institute Earth System Model (MPI-ESM) to estimate the decadal predictability of peak wind speeds (as a proxy for gusts) over Europe. Yearly initialized decadal ensemble simulations with ten members are investigated for the period 1979–2005. The SDD approach is trained with COSMO-CLM regional climate model simulations and ERA-Interim reanalysis data and applied to the MPI-ESM hindcasts. The simulations for the period 1990–1993, which was characterized by several windstorm clusters, are analyzed in detail. The anomalies of the 95 % peak wind quantile of the MPI-ESM hindcasts are in line with the positive anomalies in reanalysis data for this period. To evaluate both the skill of the decadal predictability system and the added value of the downscaling approach, quantile verification skill scores are calculated for both the MPI-ESM large-scale wind speeds and the SDD simulated regional peak winds. Skill scores are predominantly positive for the decadal predictability system, with the highest values for short lead times and for (peak) wind speeds equal or above the 75 % quantile. This provides evidence that the analyzed hindcasts and the downscaling technique are suitable for estimating wind and peak wind speeds over Central Europe on decadal time scales. The skill scores for SDD simulated peak winds are slightly lower than those for large-scale wind speeds. This behavior can be largely attributed to the fact that peak winds are a proxy for gusts, and thus have a higher variability than wind speeds. 
The introduced cost-efficient downscaling technique has the advantage of estimating not only wind speeds but also peak winds (a proxy for gusts), and can easily be applied to large ensemble datasets such as operational decadal prediction systems.
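The quantile verification skill scores mentioned above can be sketched with the standard quantile (pinball) loss. This is a generic formulation with toy numbers, not necessarily the study's exact score; positive skill means the forecast beats a reference such as climatology.

```python
def pinball_loss(obs, forecast_quantile, tau):
    """Quantile (pinball) loss for one forecast at quantile level tau."""
    d = obs - forecast_quantile
    return tau * d if d >= 0 else (tau - 1.0) * d

def quantile_skill_score(obs, forecasts, reference, tau):
    """Skill relative to a reference forecast: 1 - QS_f / QS_ref.
    Positive values indicate the forecast outperforms the reference."""
    qs_f = sum(pinball_loss(o, f, tau) for o, f in zip(obs, forecasts))
    qs_r = sum(pinball_loss(o, r, tau) for o, r in zip(obs, reference))
    return 1.0 - qs_f / qs_r

# Toy peak-wind observations vs. a sharp forecast and a crude climatology
obs = [22.0, 30.0, 18.0, 35.0]
forecast = [21.0, 29.0, 19.0, 33.0]
climatology = [25.0] * 4
score = quantile_skill_score(obs, forecast, climatology, tau=0.75)
```

Evaluating at high levels such as tau = 0.75 focuses the score on the upper tail, matching the study's emphasis on winds at or above the 75 % quantile.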
Abstract:
The regional climate modelling system PRECIS was run at 25 km horizontal resolution for 150 years (1949-2099) using global driving data from a five-member perturbed physics ensemble (based on the coupled global climate model HadCM3). Output from these simulations was used to investigate projected changes in tropical cyclones (TCs) over Vietnam and the South China Sea due to global warming (under SRES scenario A1B). Thirty-year climatological mean periods were used to examine projected changes in future (2069-2098) TCs compared to a 1961-1990 baseline. Present-day results were compared qualitatively with IBTrACS observations and found to be reasonably realistic. Future projections show a 20-44 % decrease in TC frequency, although the spatial patterns of change differ between the ensemble members, and an increase of 27-53 % in the amount of TC-associated precipitation. No statistically significant changes in TC intensity were found; however, the occurrence of more intense TCs (defined as those with a maximum 10 m wind speed > 35 m/s) was found to increase by 3-9 %. Projected increases in TC-associated precipitation are likely caused by increased evaporation and availability of atmospheric water vapour due to increased sea surface and atmospheric temperatures. The mechanisms behind the projected changes in TC frequency are difficult to link explicitly; the changes are most likely due to the combination of increased static stability, increased vertical wind shear and decreased upward motion, which suggests a weakening of the tropical overturning circulation.
Abstract:
Leaf fibers are fibers that run lengthwise through the leaves of most monocotyledonous plants, such as pineapple and banana. Pineapple (Ananas comosus) and banana (Musa indica) are emerging fibers with very large potential for use in composite materials. Over 150,000 ha of pineapple and over 100,000 ha of banana plantations are available in Brazil for fruit production, and an enormous amount of agricultural waste is produced. This residual waste represents one of the single largest sources of cellulose fibers available at almost no cost. The potential consumers of this fiber are pulp and paper, chemical feedstock, textiles and composites for the automotive, furniture and civil construction industries.