Abstract:
In enclosed shopping centres, stores benefit from the positive externalities of other stores in the centre. Some stores provide greater benefits to their neighbours than others – for example, anchor tenants and brand-leading stores. In managing shopping centres, these positive externalities might be captured through rental variations. This paper explores the determinants of rent – including externalities – for UK regional shopping centres. Two linked databases were utilised in the research. One contains characteristics of 148 shopping centres; the other has some 1,930 individual tenant records including rent level. These data were analysed to provide information on the characteristics of centres and retailers that help determine rent. Factors influencing tenant rents include market potential factors derived from urban and regional economic theory and shopping centre characteristics identified in prior retail research. The model also includes variables that proxy for the interaction between tenants and the impact of positive in-centre externalities. We find that store size is significantly and negatively related to rent: both anchor and other larger tenants pay relatively lower rents, perhaps as a result of the positive effects generated by their presence, while smaller stores, benefiting from the demand generated, pay relatively higher rents. Brand leader tenants pay lower rents than other tenants within individual retail categories.
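A minimal sketch of the kind of hedonic rent regression the abstract describes, assuming hypothetical variable names (log_rent, log_size, anchor, brand_leader, catchment) and invented data; the paper's actual specification is not reproduced here:

# Hedged sketch of a hedonic rent model with externality proxies; all
# variable names and figures below are illustrative, not from the paper.
import pandas as pd
import statsmodels.formula.api as smf

tenants = pd.DataFrame({
    "log_rent":     [5.2, 4.1, 5.8, 4.4, 5.5, 4.9, 5.9, 4.3],  # log rent per m2
    "log_size":     [7.5, 9.2, 6.8, 8.9, 7.0, 8.1, 6.5, 9.0],  # log floor area
    "anchor":       [0, 1, 0, 1, 0, 0, 0, 1],                  # anchor tenant dummy
    "brand_leader": [0, 0, 1, 0, 0, 1, 0, 0],                  # brand-leader dummy
    "catchment":    [250, 250, 310, 310, 180, 180, 420, 420],  # market-potential proxy
})

# Negative coefficients on log_size, anchor and brand_leader would mirror the
# finding that larger and brand-leading tenants pay relatively lower rents.
model = smf.ols("log_rent ~ log_size + anchor + brand_leader + catchment",
                data=tenants).fit()
print(model.params)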
Abstract:
Uranium series dating has been carried out on secondary uranyl silicate minerals formed during sub-glacial and post-glacial weathering of Proterozoic uraninite ores in south-west Finland. The samples were obtained from two sites adjacent to the Salpausselkä III ice marginal formation and cover a range of depths, from the surface to more than 60 m. Measured ages fall into three distinct groups: 70–100 ka, 28–36 ka and <2500 yr. The youngest set is associated with surface exposures and the crystals display clear evidence of re-working. The most likely trigger for uranium release at depths below the surface weathering zone is intrusion of oxidising glacial melt water, which is often characterised by very high discharge rates along channels that close once the overpressure generated at the ice margin is released. There is excellent correspondence between the two Finnish sites and published data for similar deposits over a large area of southern and central Sweden. None of the seventy samples analysed gave a U–Th age between 40 and 70 ka; a second hiatus is apparent at 20 ka, coinciding with the Last Glacial Maximum. Thus, the process responsible for uranyl silicate formation was halted for significant periods, owing to a change in geochemical conditions or the hydrogeological regime. These data support the presence of interstadial conditions during the Early and Middle Weichselian, since in the absence of major climatic perturbations the uranium phases at depth are stable. When viewed in conjunction with proxy data from mammoth remains, it would appear that the region was ice-free prior to the Last Glacial Maximum.
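For context, the simplest form of the age relation underlying uranium-series dating, written here under the simplifying assumption of initial secular equilibrium between 234U and 238U and no initial 230Th (the published work will have used the full equation with a 234U/238U correction):

% Simplified U-Th age equation (LaTeX); assumes ({}^{234}U/{}^{238}U) = 1 and
% no initial {}^{230}Th. The full treatment adds a term in \lambda_{234}.
\left(\frac{{}^{230}\mathrm{Th}}{{}^{238}\mathrm{U}}\right)_{\mathrm{activity}} = 1 - e^{-\lambda_{230}\,t}

Measured activity ratios then give the age t directly, with \lambda_{230} the 230Th decay constant.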
Abstract:
As many as fourteen US states have now mandated minimum service requirements for real estate brokerage relationships in residential transactions. This study attempts to determine whether these minimum service laws have any impact on brokerage competition. Federal government agencies allege that such laws discourage competition because they limit the offering of non-traditional brokerage services. Alternatively, however, a legislative “bright line” definition of the lowest level of acceptable service may reduce any perceived risk in offering non-traditional brokerage services and therefore encourage competition. Using several empirical strategies and state-level data over nine years (2000–08), we do not find any consistent and significant impact, positive or negative, of minimum service laws on the number of licensees per 100 households, our proxy for competition. Interestingly, we also find that association strength, as measured by Realtor association membership penetration, has a strong deterring effect on competition.
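A minimal sketch of the kind of panel estimate such a study might use: a two-way fixed-effects (difference-in-differences) regression of licensees per 100 households on a law-in-force dummy. The states, years and figures are invented, and this is not necessarily the paper's exact strategy:

# Hedged sketch: two-way fixed-effects panel regression; data and column
# names are illustrative, not the study's actual dataset or specification.
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.DataFrame({
    "state":           ["AL", "AL", "TX", "TX", "OH", "OH"],
    "year":            [2003, 2006, 2003, 2006, 2003, 2006],
    "licensees":       [1.9, 2.1, 2.4, 2.6, 1.7, 1.8],  # per 100 households
    "min_service_law": [0, 1, 0, 1, 0, 0],              # 1 once a law is in force
})

# State and year fixed effects absorb level differences; the coefficient on
# min_service_law estimates the law's impact on the competition proxy.
fe = smf.ols("licensees ~ min_service_law + C(state) + C(year)", data=panel).fit()
print(fe.params["min_service_law"])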
Abstract:
A detailed analysis is presented of solar UV spectral irradiance for the period between May 2003 and August 2005, when data are available from both the Solar Ultraviolet Spectral Irradiance Monitor (SUSIM) instrument (on board the Upper Atmosphere Research Satellite (UARS) spacecraft) and the Solar Stellar Irradiance Comparison Experiment (SOLSTICE) instrument (on board the Solar Radiation and Climate Experiment (SORCE) satellite). The ultimate aim is to develop a data composite that can be used to accurately determine any differences between the “exceptional” solar minimum at the end of solar cycle 23 and the previous minimum at the end of solar cycle 22 without having to rely on proxy data to set the long-term change. SUSIM data are studied because they are the only data available in the “SOLSTICE gap” between the end of available UARS SOLSTICE data and the start of the SORCE data. At any one wavelength the two data sets are considered too dissimilar to be combined into a meaningful composite if any one of three correlations does not exceed a threshold of 0.8. This criterion removes all wavelengths except those in a small range between 156 nm and 208 nm, the longer wavelengths of which influence ozone production and heating in the lower stratosphere. Eight different methods are employed to intercalibrate the two data sequences. All methods give smaller changes between the minima than are seen when the data are not adjusted; however, correcting the SUSIM data to allow for an exponentially decaying offset drift gives a composite that is largely consistent with the unadjusted data from the SOLSTICE instruments on both UARS and SORCE and in which the recent minimum is consistently lower in the wave band studied.
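The intercalibration method singled out above, correcting for an exponentially decaying offset drift, can be sketched as fitting the inter-instrument difference to a decaying exponential and removing it. The data below are synthetic and the exact functional form is an assumption for illustration:

# Hedged sketch: fit SUSIM - SOLSTICE to an exponentially decaying offset
# and subtract it. Synthetic data; the paper's actual procedure may differ.
import numpy as np
from scipy.optimize import curve_fit

t = np.linspace(0, 800, 200)                      # days since overlap start
signal = 1.0 + 0.01 * np.sin(2 * np.pi * t / 27)  # toy solar-rotation signal
solstice = signal
susim = signal + 0.05 * np.exp(-t / 300) + 0.002  # drifting instrument offset

def decaying_offset(t, a, tau, c):
    """Exponentially decaying drift plus a constant inter-instrument bias."""
    return a * np.exp(-t / tau) + c

popt, _ = curve_fit(decaying_offset, t, susim - solstice, p0=(0.05, 300.0, 0.0))
susim_corrected = susim - decaying_offset(t, *popt)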
Abstract:
Pesticide risk indicators provide simple support in the assessment of environmental and health risks from pesticide use, and can therefore inform policies to foster a sustainable interaction of agriculture with the environment. Because of their relative simplicity, indicators may be particularly useful under conditions of limited data availability and resources, such as in Less Developed Countries (LDCs). However, indicator complexity can vary significantly, in particular between those that rely on an exposure–toxicity ratio (ETR) and those that do not. In addition, pesticide risk indicators are usually developed for Western contexts, which might lead to incorrect estimates in LDCs. This study investigated the appropriateness of seven pesticide risk indicators for use in LDCs, with reference to smallholding agriculture in Colombia. Seven farm-level indicators, of which three rely on an ETR (POCER, EPRIP, PIRI) and four on a non-ETR approach (EIQ, PestScreen, OHRI, Dosemeci et al., 2002), were calculated and then compared by means of the Spearman rank correlation test. Indicators were also compared with respect to key indicator characteristics, i.e. user-friendliness and the ability to represent the system under study. The comparison of the indicators in terms of total environmental risk suggests that the indicators not relying on an ETR approach cannot be used as a reliable proxy for the more complex, i.e. ETR, indicators. ETR indicators, when user-friendly, show a comparative advantage over non-ETR indicators in combining the need for a relatively simple tool that can be used in contexts of limited data availability and resources with the need for a reliable estimation of environmental risk. Non-ETR indicators remain useful and accessible tools for discriminating between different pesticides prior to application. Concerning human health, simple algorithms seem more appropriate for assessing health risk in LDCs. However, further research on health risk indicators and their validation under LDC conditions is needed.
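The pairwise comparison step described above reduces, in code, to a rank correlation between two indicators' per-farm scores. The scores below are invented for illustration:

# Hedged sketch: Spearman rank correlation between two indicators' total-risk
# scores across farms. Scores are invented; the paper computed the named
# indicators (EIQ, POCER, ...) on real farm data.
from scipy.stats import spearmanr

eiq_scores   = [12.0, 45.3, 8.7, 30.1, 22.4, 51.0]   # non-ETR indicator
pocer_scores = [0.21, 0.75, 0.35, 0.40, 0.28, 0.66]  # ETR indicator

rho, p_value = spearmanr(eiq_scores, pocer_scores)
# A low rho would support the conclusion that non-ETR indicators are not a
# reliable proxy for ETR indicators.
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")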
Abstract:
This paper redefines technical efficiency by incorporating provision of environmental goods as one of the outputs of the farm. The proportion of permanent and rough grassland to total agricultural land area is used as a proxy for the provision of environmental goods. Stochastic frontier analysis was conducted using a Bayesian procedure. The methodology is applied to panel data on 215 dairy farms in England and Wales. Results show that farm efficiency rankings change when provision of environmental outputs by farms is incorporated in the efficiency analysis, which may have important political implications.
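The standard panel stochastic frontier specification behind such an analysis (the paper's exact functional form, outputs and priors are not reproduced here) can be written as:

% Generic stochastic frontier (LaTeX): v is symmetric noise, u >= 0 is
% inefficiency; here the output set y would include the grassland proxy.
\ln y_{it} = f(\mathbf{x}_{it}; \boldsymbol{\beta}) + v_{it} - u_{it},
\qquad v_{it} \sim N(0, \sigma_v^2), \quad u_{it} \ge 0

Re-ranking farms on estimated efficiency after adding the environmental-good proxy to the outputs is what drives the paper's headline result.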
Abstract:
A number of transient climate runs simulating the last 120 kyr have been carried out using FAMOUS, a fast atmosphere-ocean general circulation model (AOGCM). This is the first time such experiments have been done with a full AOGCM, providing a three-dimensional simulation of both atmosphere and ocean over this period. Our simulation thus includes internally generated temporal variability over periods from days to millennia, and detailed physical representations of important processes such as clouds and precipitation. Although the model is fast, computational restrictions mean that the rate of change of the forcings has been increased by a factor of 10, making each experiment 12 kyr long. Atmospheric greenhouse gases (GHGs), northern hemisphere ice sheets and variations in solar radiation arising from changes in the Earth's orbit are treated as forcing factors, and are applied either separately or combined in different experiments. The long-term temperature changes on Antarctica match well with reconstructions derived from ice-core data, as does variability on timescales longer than 10 kyr. Last Glacial Maximum (LGM) cooling on Greenland is reasonably well simulated, although our simulations, which lack ice-sheet meltwater forcing, do not reproduce the abrupt, millennial-scale climate shifts seen in northern hemisphere climate proxies or their slower southern hemisphere counterparts. The spatial pattern of sea surface cooling at the LGM matches proxy reconstructions reasonably well. There is significant anti-correlated variability in the strengths of the Atlantic Meridional Overturning Circulation (AMOC) and the Antarctic Circumpolar Current (ACC) on timescales greater than 10 kyr in our experiments. We find that GHG forcing weakens the AMOC and strengthens the ACC, whilst the presence of northern hemisphere ice sheets strengthens the AMOC and weakens the ACC. The structure of the AMOC at the LGM is found to be sensitive to the details of the ice-sheet reconstruction used. The precessional component of the orbital forcing induces ~20 kyr oscillations in the AMOC and ACC, whose amplitude is mediated by changes in the eccentricity of the Earth's orbit. These forcing influences combine, to first order, in a linear fashion to produce the mean climate and ocean variability seen in the run with all forcings.
Abstract:
During the Last Glacial Maximum (LGM, ∼21,000 years ago) the cold climate was strongly tied to low atmospheric CO2 concentration (∼190 ppm). Although it is generally assumed that this low CO2 was due to an expansion of the oceanic carbon reservoir, simulating the glacial level has remained a challenge, especially with the additional δ13C constraint. Indeed, the LGM carbon cycle was also characterized by a modern-like δ13C in the atmosphere and a higher surface-to-deep Atlantic δ13C gradient, indicating probable changes in the thermohaline circulation. Here we show, with a model of intermediate complexity, that adding three oceanic mechanisms – brine-induced stratification, stratification-dependent diffusion and iron fertilization – to the standard glacial simulation (which includes sea level drop, temperature change, carbonate compensation and terrestrial carbon release) decreases CO2 down to the glacial value of ∼190 ppm and simultaneously matches glacial atmospheric and oceanic δ13C inferred from proxy data. LGM CO2 and δ13C can at last be successfully reconciled.
High resolution Northern Hemisphere wintertime mid-latitude dynamics during the Last Glacial Maximum
Abstract:
Hourly winter weather of the Last Glacial Maximum (LGM) is simulated using the Community Climate Model version 3 (CCM3) on a globally resolved T170 (75 km) grid. Results are compared to a longer LGM climatological run with the same boundary conditions and monthly saves. Hourly-scale animations are used to enhance interpretations. The purpose of the study is to explore whether additional insights into ice age conditions can be gleaned by going beyond the standard employment of monthly average model statistics to infer ice age weather and climate. Results for both LGM runs indicate a decrease in North Atlantic and an increase in North Pacific cyclogenesis. Storm trajectories react to the mechanical forcing of the Laurentide Ice Sheet, with Pacific storms tracking over middle Alaska and northern Canada, terminating in the Labrador Sea. This result is consistent with other model results in also showing a significant reduction in Greenland wintertime precipitation – a response supported by ice core evidence. Higher temporal resolution puts in sharper focus the close tracking of Pacific storms along the west coast of North America. This response is consistent with increased poleward heat transport in the LGM climatological run and could help explain “early” glacial warming inferred in this region from proxy climate records. Additional analyses show a large increase in central Asian surface gustiness that supports observational inferences that upper-level winds associated with Asian–Pacific storms transported Asian dust to Greenland during the LGM.
Abstract:
The rapid expansion of the TMT sector in the late 1990s and the more recent growing regulatory and corporate focus on business continuity and security have raised the profile of data centres. Data centres offer a unique blend of occupational, physical and technological characteristics compared to conventional real estate assets. Limited trading and the heterogeneity of data centres also cause higher levels of appraisal uncertainty. In practice, the application of conventional discounted cash flow approaches requires information about a wide range of inputs that is difficult to derive from limited market signals or estimate analytically. This paper proposes an approach that uses pricing signals from similar traded cash flows. Based upon ‘the law of one price’, the method draws upon the premise that two identical future cash flows must have the same value now. Given the difficulties of estimating exit values, an alternative is that the expected cash flows of a data centre are analysed over the life cycle of the building, with corporate bond yields used to provide a proxy for the appropriate discount rates for lease income. Since liabilities are quite diverse, a number of proxies are suggested as discount and capitalisation rates, including index-linked, fixed-interest and zero-coupon bonds. Although there are rarely assets that have identical cash flows and some approximation is necessary, the level of appraiser subjectivity is dramatically reduced.
Abstract:
This paper analyses the appraisal of a specialized form of real estate - data centres - that has a unique blend of locational, physical and technological characteristics that differentiate it from conventional real estate assets. Market immaturity, limited trading and a lack of pricing signals enhance levels of appraisal uncertainty and disagreement relative to conventional real estate assets. Given the problems of applying standard discounted cash flow, an approach to appraisal is proposed that uses pricing signals from traded cash flows that are similar to the cash flows generated from data centres. Based upon ‘the law of one price’, it is assumed that two assets that are expected to generate identical cash flows in the future must have the same value now. It is suggested that the expected cash flow of assets should be analysed over the life cycle of the building. Corporate bond yields are used to provide a proxy for the appropriate discount rates for lease income. Since liabilities are quite diverse, a number of proxies are suggested as discount and capitalisation rates, including index-linked, fixed-interest and zero-coupon bonds.
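The ‘law of one price’ appraisal idea common to both of these papers can be sketched as discounting each year's lease income at the yield of a traded bond of matching maturity. The function and figures below are illustrative assumptions, not the authors' model:

# Hedged sketch: value lease income by discounting each dated cash flow at
# the matching-maturity bond yield. Yields and cash flows are illustrative.
def present_value(cash_flows, spot_yields):
    """PV of dated cash flows, each discounted at the zero-coupon yield for
    its maturity - the replication idea behind 'the law of one price'."""
    return sum(cf / (1 + y) ** t
               for t, (cf, y) in enumerate(zip(cash_flows, spot_yields), start=1))

lease_income = [500] * 10   # ten years of lease income (illustrative, 000s)
yields = [0.040, 0.042, 0.044, 0.045, 0.046,
          0.047, 0.048, 0.048, 0.049, 0.050]   # corporate bond spot yields

print(f"Appraised value of the lease income: {present_value(lease_income, yields):.0f}")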
Abstract:
In this paper the authors exploit two equivalent formulations of the average rate of material entropy production in the climate system to propose an approximate splitting between contributions due to vertical and eminently horizontal processes. This approach is based only on 2D radiative fields at the surface and at the top of atmosphere. Using 2D fields at the top of atmosphere alone, lower bounds to the rate of material entropy production and to the intensity of the Lorenz energy cycle are derived. By introducing a measure of the efficiency of the planetary system with respect to horizontal thermodynamic processes, it is possible to gain insight into a previous intuition on the possibility of defining a baroclinic heat engine extracting work from the meridional heat flux. The approximate formula of the material entropy production is verified and used for studying the global thermodynamic properties of climate models (CMs) included in the Program for Climate Model Diagnosis and Intercomparison (PCMDI)/phase 3 of the Coupled Model Intercomparison Project (CMIP3) dataset in preindustrial climate conditions. It is found that about 90% of the material entropy production is due to vertical processes such as convection, whereas the large-scale meridional heat transport contributes only about 10% of the total. This suggests that the traditional two-box models used for providing a minimal representation of entropy production in planetary systems are not appropriate, whereas a basic—but conceptually correct—description can be framed in terms of a four-box model. The total material entropy production is typically 55 mW m−2 K−1, with discrepancies on the order of 5%, and CMs’ baroclinic efficiencies are clustered around 0.055. The lower bounds on the intensity of the Lorenz energy cycle featured by CMs are found to be around 1.0–1.5 W m−2, which implies that the derived inequality is rather stringent. When looking at the variability and covariability of the considered thermodynamic quantities, the agreement among CMs is worse, suggesting that the description of feedbacks is more uncertain. The contributions to material entropy production from vertical and horizontal processes are positively correlated, so that no compensation mechanism seems to be in place. Quite consistently among CMs, the variability of the efficiency of the system is a better proxy for variability of the entropy production due to horizontal processes than that of the large-scale heat flux. The possibility of providing constraints on the 3D dynamics of the fluid envelope based only on 2D observations of radiative fluxes seems promising for the observational study of planets and for testing numerical models.
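The roughly 10% horizontal share quoted above can be checked with a two-box back-of-envelope estimate: the entropy produced by moving heat F from a warm to a cold reservoir is F(1/T_cold - 1/T_warm). The numbers below are rough textbook values, not taken from the paper:

# Hedged back-of-envelope, two-box estimate of entropy production by
# meridional heat transport. Values are rough illustrative numbers.
F = 5.0e15                       # meridional heat transport, W (rough peak value)
T_warm, T_cold = 300.0, 270.0    # representative emission temperatures, K
A_earth = 5.1e14                 # Earth's surface area, m^2

s_horiz = F * (1.0 / T_cold - 1.0 / T_warm) / A_earth
print(f"{s_horiz * 1e3:.1f} mW m-2 K-1")  # ~4 mW m-2 K-1, i.e. of order 10%
                                          # of the ~55 mW m-2 K-1 total quoted above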
Abstract:
A detailed analysis is undertaken of the Atlantic-European climate using data from 500-year-long proxy-based climate reconstructions, a long climate simulation with perpetual 1990 forcing, as well as two global and one regional climate change scenarios. The observed and simulated interannual variability and teleconnectivity are compared and interpreted in order to improve the understanding of natural climate variability on interannual to decadal time scales for the late Holocene. The focus is on the Atlantic-European and Alpine regions during the winter and summer seasons, using temperature, precipitation, and 500 hPa geopotential height fields. The climate reconstruction shows pronounced interdecadal variations that appear to “lock” the atmospheric circulation in quasi-steady long-term patterns over multi-decadal periods, controlling at least part of the temperature and precipitation variability. Different circulation patterns are persistent over several decades for the period 1500 to 1900. The 500-year-long simulation with perpetual 1990 forcing shows some substantial differences, with a more unsteady teleconnectivity behaviour. Two global scenario simulations indicate a transition towards more stable teleconnectivity for the next 100 years. Time series of reconstructed and simulated temperature and precipitation over the Alpine region show comparatively small changes in interannual variability within the time frame considered, with the exception of the summer season, where a substantial increase in interannual variability is simulated by regional climate models.
Abstract:
This study examines the contradictory predictions regarding the association between the premium paid in acquisitions and deal size. We document a robust negative relation between offer premia and target size, indicating that acquirers tend to pay less for large firms, not more. We also find that the overpayment potential is lower in acquisitions of large targets. Yet such acquisitions still destroy more value for acquirers around deal announcements, implying that target size may proxy, among other things, for the unobserved complexity inherent in large deals. We provide evidence in favor of this interpretation.
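A minimal sketch of the documented premium-size relation; the six data points are invented to show the shape of the finding (a negative slope of premium on log target size), not the paper's estimates:

# Hedged sketch: fit offer premium against log target size; a negative slope
# mirrors the paper's finding. All numbers are invented.
import numpy as np

log_target_size = np.array([4.5, 5.2, 6.1, 6.8, 7.9, 8.5])       # log market cap
offer_premium   = np.array([0.52, 0.46, 0.41, 0.37, 0.30, 0.27])  # premium fraction

slope, intercept = np.polyfit(log_target_size, offer_premium, 1)
print(f"premium changes by {slope:.3f} per unit of log target size")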
Abstract:
Water-table reconstructions from Holocene peatlands are increasingly being used as indicators of terrestrial palaeoclimate in many regions of the world. However, the links between peatland water tables, climate, and long-term peatland development are poorly understood. Here we use a combination of high-resolution proxy climate data and a model of long-term peatland development to examine the relationship between rapid hydrological fluctuations in peatlands and climatic forcing. We show that changes in water-table depth can occur independently of climate forcing. Ecohydrological feedbacks inherent in peatland development can lead to a degree of homeostasis that partially disconnects peatland water-table behaviour from external climatic influences. We conclude by suggesting that further work needs to be done before peat-based climate reconstructions can be used to test climate models.