38 results for Retention Thresholds
Abstract:
The retention rate of a company has an impact on its earnings and dividend growth. Lease structures and performance measurement practice force real estate investment managers to adopt full distribution policies. Does this lead to lower income growth in real estate? This paper examines several European office markets across which the effective retention rates vary. It then compares depreciation rates across these markets. It is concluded that there is evidence of a relationship between retention and depreciation. Those markets with particularly inflexible lease structures exhibit low retention rates and higher levels of rental value depreciation. This poses interesting questions concerning the appropriate way to measure property performance across markets exhibiting significantly different retention rates and also raises important issues for global investors.
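The link between retention and growth that the abstract invokes is conventionally captured by the sustainable growth rate, g = b × ROE, where b is the retention (plowback) ratio. A minimal sketch, using illustrative figures that are not drawn from the paper:

```python
# Sustainable-growth sketch: growth funded by retained earnings is
# conventionally g = b * ROE, where b is the retention (plowback) ratio.
# The ROE and retention figures below are illustrative only.

def sustainable_growth(retention_ratio: float, return_on_equity: float) -> float:
    """Growth rate implied by reinvesting a fraction b of earnings at a given ROE."""
    return retention_ratio * return_on_equity

# A full-distribution vehicle (b = 0) versus a firm retaining 40% of earnings:
print(sustainable_growth(0.0, 0.08))   # 0.0 -> no internally funded growth
print(sustainable_growth(0.4, 0.08))
```

Under this identity, a full-distribution policy implies zero internally funded growth, which is why the paper asks whether forced distribution depresses income growth in real estate.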
Abstract:
Measuring the retention, or residence time, of dosage forms to biological tissue is commonly a qualitative measurement, where no real values to describe the retention can be recorded. The result of this is an assessment that is dependent upon a user's interpretation of visual observation. This research paper outlines the development of a methodology to quantitatively measure, both by image analysis and by spectrophotometric techniques, the retention of material to biological tissues, using the retention of polymer solutions to ocular tissue as an example. Both methods have been shown to be repeatable, with the spectrophotometric measurement generating data reliably and quickly for further analysis.
Abstract:
Diffuse pollution, and the contribution from agriculture in particular, has become increasingly important as pollution from point sources has been addressed by wastewater treatment. Land management approaches, such as construction of field wetlands, provide one group of mitigation options available to farmers. Although field wetlands are widely used for diffuse pollution control in temperate environments worldwide, there is a shortage of evidence for the effectiveness and viability of these mitigation options in the UK. The Mitigation Options for Phosphorus and Sediment Project aims to make recommendations regarding the design and effectiveness of field wetlands for diffuse pollution control in UK landscapes. Ten wetlands have been built on four farms in Cumbria and Leicestershire. This paper focuses on sediment retention within the wetlands, estimated from annual sediment surveys in the first two years, and discusses establishment costs. It is clear that the wetlands are effective in trapping a substantial amount of sediment. Estimates of annual sediment retention suggest higher trapping rates at sandy sites (0.5–6 t ha⁻¹ yr⁻¹), compared to silty sites (0.02–0.4 t ha⁻¹ yr⁻¹) and clay sites (0.01–0.07 t ha⁻¹ yr⁻¹). Establishment costs for the wetlands ranged from £280 to £3100 and depended more on site-specific factors, such as fencing and gateways on livestock farms, than on wetland size or design. Wetlands with lower trapping rates would also have lower maintenance costs, as dredging would be required less frequently. The results indicate that field wetlands show promise for inclusion in agri-environment schemes, particularly if capital payments can be provided for establishment, to encourage uptake of these multi-functional features.
Abstract:
Records of Atlantic basin tropical cyclones (TCs) since the late nineteenth century indicate a very large upward trend in storm frequency. This increase in documented TCs has been previously interpreted as resulting from anthropogenic climate change. However, improvements in observing and recording practices provide an alternative interpretation for these changes: recent studies suggest that the number of potentially missed TCs is sufficient to explain a large part of the recorded increase in TC counts. This study explores the influence of another factor—TC duration—on observed changes in TC frequency, using a widely used Atlantic hurricane database (HURDAT). It is found that the occurrence of short-lived storms (duration of 2 days or less) in the database has increased dramatically, from less than one per year in the late nineteenth–early twentieth century to about five per year since about 2000, while medium- to long-lived storms have increased little, if at all. Thus, the previously documented increase in total TC frequency since the late nineteenth century in the database is primarily due to an increase in very short-lived TCs. The authors also undertake a sampling study based upon the distribution of ship observations, which provides quantitative estimates of the frequency of missed TCs, focusing just on the moderate to long-lived systems with durations exceeding 2 days in the raw HURDAT. Upon adding the estimated numbers of missed TCs, the time series of moderate to long-lived Atlantic TCs show substantial multidecadal variability, but neither time series exhibits a significant trend since the late nineteenth century, with a nominal decrease in the adjusted time series. Thus, to understand the source of the century-scale increase in Atlantic TC counts in HURDAT, one must explain the relatively monotonic increase in very short-duration storms since the late nineteenth century. 
While it is possible that the recorded increase in short-duration TCs represents a real climate signal, the authors consider that it is more plausible that the increase arises primarily from improvements in the quantity and quality of observations, along with enhanced interpretation techniques. These have allowed National Hurricane Center forecasters to better monitor and detect initial TC formation, and thus incorporate increasing numbers of very short-lived systems into the TC database.
Abstract:
The redistribution of a finite amount of martian surface dust during global dust storms and in the intervening periods has been modelled in a dust lifting version of the UK Mars General Circulation Model. When using a constant, uniform threshold in the model’s wind stress lifting parameterisation and assuming an unlimited supply of surface dust, multiannual simulations displayed some variability in dust lifting activity from year to year, arising from internal variability manifested in surface wind stress, but dust storms were limited in size and formed within a relatively short seasonal window. Lifting thresholds were then allowed to vary at each model gridpoint, dependent on the rates of emission or deposition of dust. This enhanced interannual variability in dust storm magnitude and timing, such that model storms covered most of the observed ranges in size and initiation date within a single multiannual simulation. Peak storm magnitude in a given year was primarily determined by the availability of surface dust at a number of key sites in the southern hemisphere. The observed global dust storm (GDS) frequency of roughly one in every 3 years was approximately reproduced, but the model failed to generate these GDSs spontaneously in the southern hemisphere, where they have typically been observed to initiate. After several years of simulation, the surface threshold field—a proxy for net change in surface dust density—showed good qualitative agreement with the observed pattern of martian surface dust cover. The model produced a net northward cross-equatorial dust mass flux, which necessitated the addition of an artificial threshold decrease rate in order to allow the continued generation of dust storms over the course of a multiannual simulation. 
At standard model resolution, for the southward mass flux due to cross-equatorial flushing storms to offset the northward flux due to GDSs on a timescale of ∼3 years would require an increase in the former by a factor of 3–4. Results at higher model resolution and uncertainties in dust vertical profiles mean that quasi-periodic redistribution of dust on such a timescale nevertheless appears to be a plausible explanation for the observed GDS frequency.
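The abstract describes gridpoint lifting thresholds that evolve with the local dust budget: net emission depletes surface dust and hardens lifting, net deposition eases it, and an artificial decrease rate keeps storms possible over multiannual runs. A toy sketch of such a feedback, assuming a linear update with hypothetical rate constants (this is not the UK MGCM's actual parameterisation):

```python
# Toy dust-availability feedback on a wind-stress lifting threshold:
# net emission at a gridpoint raises the local threshold (surface dust
# depleted), net deposition lowers it, and a small artificial decrease
# offsets the net northward dust flux. The linear form and the constants
# are hypothetical illustrations, not the model's scheme.

def update_threshold(tau_t: float, emission: float, deposition: float,
                     k: float = 0.01, artificial_decrease: float = 1e-4) -> float:
    """One timestep of the gridpoint threshold, a proxy for net surface dust change."""
    tau_t += k * (emission - deposition)   # depletion makes lifting harder
    tau_t -= artificial_decrease           # slow relaxation keeps lifting possible
    return max(tau_t, 0.0)                 # threshold cannot go negative

tau = 0.02
tau = update_threshold(tau, emission=0.5, deposition=0.1)
print(tau)
```

In this reading, the accumulated threshold field doubles as a map of net surface dust change, which is what the abstract compares against observed martian surface dust cover.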
Abstract:
Dissolved organic carbon (DOC) in acid-sensitive upland waters is dominated by allochthonous inputs from organic-rich soils, yet inter-site variability in soil DOC release to changes in acidity has received scant attention in spite of the reported differences between locations in surface water DOC trends over the last few decades. In a previous paper, we demonstrated that pH-related retention of DOC in O horizon soils was influenced by acid-base status, particularly the exchangeable Al content. In the present paper, we investigate the effect of sulphate additions (0–437 μeq l−1) on DOC release in the mineral B horizon soils from the same locations. Dissolved organic carbon release decreased with declining pH in all soils, although the shape of the pH-DOC relationships differed between locations, reflecting the multiple factors controlling DOC mobility. The release of DOC decreased by 32–91% in the treatment with the largest acid input (437 μeq l−1), with the greatest decreases occurring in soils with very small % base saturation (BS, <3%) and/or large capacity for sulphate (SO42−) retention (up to 35% of added SO42−). The greatest DOC release occurred in the soil with the largest initial base status (12% BS). These results support our earlier conclusions that differences in acid-base status between soils alter the sensitivity of DOC release to similar sulphur deposition declines. However, superimposed on this is the capacity of mineral soils to sorb DOC and SO42−, and more work is needed to determine the fate of sorbed DOC under conditions of increasing pH and decreasing SO42−.
Abstract:
An updated empirical approach is proposed for specifying coexistence requirements for genetically modified (GM) maize (Zea mays L.) production to ensure compliance with the 0.9% labeling threshold for food and feed in the European Union. The model improves on a previously published (Gustafson et al., 2006) empirical model by adding recent data sources to supplement the original database and including the following additional cases: (i) more than one GM maize source field adjacent to the conventional or organic field, (ii) the possibility of so-called “stacked” varieties with more than one GM trait, and (iii) lower pollen shed in the non-GM receptor field. These additional factors lead to the possibility for somewhat wider combinations of isolation distance and border rows than required in the original version of the empirical model. For instance, in the very conservative case of a 1-ha square non-GM maize field surrounded on all four sides by homozygous GM maize with 12 m isolation (the effective isolation distance for a single GM field), non-GM border rows of 12 m are required to be 95% confident of gene flow less than 0.9% in the non-GM field (with adventitious presence of 0.3%). Stacked traits of higher GM mass fraction and receptor fields of lower pollen shed would require a greater number of border rows to comply with the 0.9% threshold, and an updated extension to the model is provided to quantify these effects.
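The compliance arithmetic implied by the abstract is simple budget bookkeeping: with the EU labeling threshold at 0.9% and adventitious presence already contributing 0.3%, the allowance left for pollen-mediated gene flow into the non-GM field is 0.6%. A minimal sketch of that bookkeeping only; it is not the empirical gene-flow model itself, and the helper names are illustrative:

```python
# Labeling-threshold budget implied by the abstract: total GM content must
# stay under the 0.9% EU threshold, and adventitious presence (0.3% in the
# conservative case described) consumes part of that budget. This helper
# does only the bookkeeping; the actual isolation-distance/border-row model
# is the empirical one described in the paper.

def gene_flow_budget(threshold_pct: float = 0.9,
                     adventitious_presence_pct: float = 0.3) -> float:
    """Remaining allowance for cross-pollination before the label threshold is hit."""
    return threshold_pct - adventitious_presence_pct

def complies(gene_flow_pct: float, **kwargs) -> bool:
    """True if estimated pollen-mediated gene flow stays within the remaining budget."""
    return gene_flow_pct < gene_flow_budget(**kwargs)

print(gene_flow_budget())   # remaining budget, about 0.6%
print(complies(0.45))
```

This is why the 95%-confidence design targets gene flow below 0.9% inclusive of the 0.3% adventitious presence, rather than treating the two contributions independently.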
Abstract:
Detection of a tactile stimulus on one finger is impaired when a concurrent stimulus (masker) is presented on an additional finger of the same or the opposite hand. This phenomenon is known to be finger-specific at the within-hand level. However, whether this specificity is also maintained at the between-hand level is not known. In four experiments, we addressed this issue by combining a Bayesian adaptive staircase procedure (QUEST) with a two-interval forced choice (2IFC) design in order to establish thresholds for detecting 200 ms, 100 Hz sinusoidal vibrations applied to the index or little fingertip of either hand (targets). We systematically varied the masker finger (index, middle, ring, or little finger of either hand), while controlling the spatial location of the target and masker stimuli. Detection thresholds varied consistently as a function of the masker finger when the latter was on the same hand (Experiments 1 and 2), but not when on different hands (Experiments 3 and 4). Within the hand, detection thresholds increased for masker fingers closest to the target finger (i.e., middle > ring when the target was the index). Between the hands, detection thresholds were higher when a masker was present on any finger, compared to when the target was presented in isolation. The within-hand effect of masker finger is consistent with the segregation of different fingers at the early stages of somatosensory processing, from the periphery to the primary somatosensory cortex (SI). We propose that detection is finger-specific and reflects the organisation of somatosensory receptive fields in SI within, but not between, the hands.
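The threshold-tracking idea behind the procedure can be illustrated with a simplified staircase. QUEST itself maintains a Bayesian posterior over the threshold; the sketch below substitutes a classic 1-up/2-down transformed staircase (which converges near the 70.7%-correct point) and an idealised observer, so it is a stand-in for, not a reimplementation of, the study's method:

```python
# Simplified adaptive staircase for a detection threshold. This is a
# 1-up/2-down rule with an idealised observer (perfect detection above
# threshold, chance guessing in the 2IFC task below it), used here only
# to illustrate threshold tracking; the study used Bayesian QUEST.
import random

def staircase(true_threshold: float, start: float = 1.0, step: float = 0.05,
              n_trials: int = 400, seed: int = 0) -> float:
    rng = random.Random(seed)
    level, correct_streak = start, 0
    for _ in range(n_trials):
        # Observer: always correct above threshold, else guesses at 50% (2IFC chance).
        correct = level > true_threshold or rng.random() < 0.5
        if correct:
            correct_streak += 1
            if correct_streak == 2:              # two correct in a row -> harder
                level = max(level - step, 0.0)
                correct_streak = 0
        else:                                     # miss -> easier
            level += step
            correct_streak = 0
    return level

# The track descends from the start level and then oscillates near threshold:
print(round(staircase(0.3), 2))
```

Masking effects then show up as an upward shift of the converged level when a masker finger is stimulated concurrently, relative to the target-alone track.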
Abstract:
The application of the Water Framework Directive (WFD) in the European Union (EU) targets certain threshold levels for the concentration of various nutrients, nitrogen and phosphorus being the most important. In the EU, agri-environmental measures constitute a significant component of Pillar 2 (Rural Development Policies) in both financial and regulatory terms. Environmental measures are also linked to Pillar 1 payments through cross-compliance and the greening proposals. This paper, drawing on work carried out in the REFRESH FP7 project, aims to show how an INtegrated CAtchment model of plant/soil system dynamics and in-stream biogeochemical and hydrological dynamics can be used to assess the cost-effectiveness of agri-environmental measures in relation to nutrient concentration targets set by the WFD, especially in the presence of important habitats. We present the procedures (methodological steps, challenges and problems) for assessing the cost-effectiveness of agri-environmental measures at the baseline situation, and under climate and land use change scenarios. Furthermore, we present results of an application of this methodology to the Louros watershed in Greece and discuss the likely uses and future extensions of the modelling approach. Finally, we attempt to reveal the importance of this methodology for designing and incorporating alternative environmental practices in Pillar 1 and 2 measures.