942 results for SIP Proxy Relay PJSIP Outbound


Relevance:

10.00%

Publisher:

Abstract:

This paper examines whether shell oxygen isotope ratios (δ18Oar) of Unio sp. can be used as a proxy for past discharge of the river Meuse. The proxy was developed from a modern dataset for the reference time interval 1997–2007, which showed a logarithmic relationship between discharge and measured water oxygen isotope ratios (δ18Ow). To test this relationship for past time intervals, δ18Oar values were measured in the aragonite of the growth increments of four Unio sp. shells: two from a relatively wet period (1910–1918) and two from a very dry time interval (1969–1977). Shell δ18Oar records were converted into δ18Ow values using existing water temperature records. Summer δ18Ow values reconstructed from δ18Oar of 1910–1918 showed a similar range to the summer δ18Ow values for the reference time interval 1997–2007, whilst reconstructed summer δ18Ow values for the time interval 1969–1977 were anomalously high. These high δ18Ow values suggest that the river Meuse experienced severe summer droughts during the latter time interval. Discharge was then estimated from the reconstructed δ18Ow values using the logarithmic relationship between δ18Ow and discharge. A comparison of the calculated summer discharge results with observed discharge data showed that Meuse low-discharge events below a threshold value of 6 m3/s can be detected in the reconstructed δ18Ow records, but true quantification remains problematic.
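
To make the conversion chain concrete, here is a minimal sketch, assuming the Grossman and Ku (1986) aragonite palaeotemperature equation and an illustrative logarithmic discharge calibration; the coefficients and sample values below are placeholders, not the paper's fitted 1997–2007 relationship.

```python
import numpy as np

# Placeholder inputs: shell aragonite d18O (per mil VPDB) and water temperature
# (deg C) taken from instrumental records for the same growth increments.
shell_d18O = np.array([-6.8, -7.4, -8.1])
water_temp = np.array([18.0, 20.0, 22.0])

def d18Ow_from_shell(d18O_ar, temp_c):
    """Invert an aragonite palaeotemperature equation of the Grossman & Ku (1986)
    form, T = 20.6 - 4.34 * (d18O_aragonite - d18O_water), for water d18O."""
    return d18O_ar - (20.6 - temp_c) / 4.34

def discharge_from_d18Ow(d18Ow, a=-1.2, b=-3.0):
    """Invert an assumed logarithmic calibration d18Ow = a * ln(Q) + b;
    a and b are illustrative, not the fitted Meuse coefficients."""
    return np.exp((d18Ow - b) / a)

d18Ow = d18Ow_from_shell(shell_d18O, water_temp)
print("reconstructed water d18O:", np.round(d18Ow, 2))
print("rough discharge estimate (m3/s):", np.round(discharge_from_d18Ow(d18Ow), 1))
```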

Relevance:

10.00%

Publisher:

Abstract:

The application of oxygen isotope ratios (δ18O) from freshwater bivalves as a proxy for river discharge conditions in the Rhine and Meuse rivers is investigated. We compared a dataset of water temperature and water δ18O values with a selection of recent shell δ18O records for two species of the genus Unio in order to establish: (1) whether differences between the rivers in water δ18O values, reflecting river discharge conditions, are recorded in unionid shells; and (2) to what extent ecological parameters influence the accuracy of bivalve shell δ18O values as proxies of seasonal water oxygen isotope conditions in these rivers. The results show that shells from the two rivers differ significantly in δ18O values, reflecting different source waters for these two rivers. The seasonal shell δ18O records show truncated sinusoidal patterns with narrow peaks and wide troughs, caused by temperature fractionation and winter growth cessation. Interannual growth rate reconstructions show an ontogenetic growth rate decrease. Growth lines in the shell often, but not always, coincide with winter growth cessations in the δ18O record, suggesting that growth cessations in the shell δ18O records are a better age estimator than counting internal growth lines. Seasonal predicted and measured δ18O values correspond well, supporting the hypothesis that these unionids precipitate their shells in oxygen isotopic equilibrium. This means that (sub-)fossil unionids can be used to reconstruct spring-summer river discharge conditions, such as Meuse low-discharge events caused by droughts and Rhine meltwater-influx events caused by melting of snow in the Alps.
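
A minimal forward-model sketch of why winter growth cessation truncates the seasonal shell record, again assuming a Grossman and Ku (1986)-style fractionation equation; the seasonal temperature cycle, water δ18O cycle and growth threshold are illustrative assumptions, not measurements from the Rhine or Meuse.

```python
import numpy as np

days = np.arange(365)
# Assumed seasonal cycles of river temperature (deg C) and water d18O (per mil).
water_temp = 12.0 + 9.0 * np.sin(2 * np.pi * (days - 120) / 365)
d18O_water = -8.5 + 0.5 * np.sin(2 * np.pi * (days - 200) / 365)

# Predicted equilibrium shell aragonite d18O (Grossman & Ku 1986 form).
d18O_shell = d18O_water + (20.6 - water_temp) / 4.34

# Growth only above an assumed temperature threshold: the winter part of the
# cycle is never archived, so the recorded series is a truncated sinusoid.
recorded = d18O_shell[water_temp > 12.0]
print(f"full seasonal range: {d18O_shell.min():.2f} to {d18O_shell.max():.2f} per mil")
print(f"recorded (growth season) range: {recorded.min():.2f} to {recorded.max():.2f} per mil")
```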

Relevance:

10.00%

Publisher:

Abstract:

Volatility, or the variability of the underlying asset, is one of the key fundamental components of property derivative pricing and of the application of real option models in development analysis. There has been relatively little work on volatility in real estate in terms of its application to property derivatives and real options analysis. Most research on volatility stems from investment performance (Nathakumaran & Newell 1995; Brown & Matysiak 2000; Booth & Matysiak 2001). Historic standard deviation is often used as a proxy for volatility, and there has been a reliance on indices, which are subject to valuation smoothing effects. Transaction prices are considered to be more volatile than the traditional standard deviations of appraisal-based indices. This could arguably lead to inefficiencies and mis-pricing, particularly if it is also accepted that changes evolve randomly over time and that future volatility, not an ex-post measure, is the key (Sing 1998). If history does not repeat, or provides an unreliable measure, then estimating model-based (implied) volatility is an alternative approach (Patel & Sing 2000). This paper is the first of two that employ alternative approaches to calculating and capturing volatility in UK real estate for the purposes of applying the measure to derivative pricing and real option models. It draws on a uniquely constructed IPD/Gerald Eve transactions database containing over 21,000 properties over the period 1983–2005. This first paper examines the magnitude of historic volatility associated with asset returns by sector and geographic spread. The subsequent paper will focus on model-based (implied) volatility.
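
As a concrete illustration of the historic (ex-post) measure discussed above, a minimal sketch computing volatility as the sample standard deviation of a return series; the returns are invented, not taken from the IPD/Gerald Eve database.

```python
import numpy as np

# Illustrative annual total returns for one sector (fractions, not real data).
returns = np.array([0.08, 0.11, -0.03, 0.15, 0.02, -0.07, 0.09, 0.12, 0.04, -0.01])

# Historic volatility: sample standard deviation of the return series.
historic_vol = returns.std(ddof=1)
print(f"historic volatility: {historic_vol:.2%}")

# For higher-frequency series, annualise by sqrt(periods per year),
# e.g. quarterly returns: vol_annual = vol_quarterly * np.sqrt(4).
```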

Relevance:

10.00%

Publisher:

Abstract:

In enclosed shopping centres, stores benefit from the positive externalities of other stores in the centre. Some stores, for example anchor tenants and brand-leading stores, provide greater benefits to their neighbours than others. In managing shopping centres, these positive externalities might be captured through rental variations. This paper explores the determinants of rent, including externalities, for UK regional shopping centres. Two linked databases were utilised in the research: one contains characteristics of 148 shopping centres; the other holds some 1,930 individual tenant records, including rent levels. These data were analysed to provide information on the characteristics of centres and retailers that help determine rent. Factors influencing tenant rents include market potential factors derived from urban and regional economic theory and shopping centre characteristics identified in prior retail research. The model also includes variables that proxy for the interaction between tenants and the impact of positive in-centre externalities. We find that store size is significantly and negatively related to tenant rent: both anchor and other larger tenants pay relatively lower rents, perhaps as a result of the positive effects generated by their presence, while smaller stores, benefiting from the demand generated, pay relatively higher rents. Brand-leader tenants pay lower rents than other tenants within individual retail categories.
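
A minimal sketch of a hedonic rent regression in the spirit of the model described: log rent regressed on log store size and tenant-type dummies by ordinary least squares. The variable names and simulated data are illustrative assumptions, not the paper's specification or the 1,930 tenant records.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
log_size = rng.normal(6.0, 1.0, n)                    # log floor area
anchor = (rng.random(n) < 0.10).astype(float)         # anchor-tenant dummy
brand_leader = (rng.random(n) < 0.20).astype(float)   # brand-leader dummy
# Simulated log rent per unit area with negative size and anchor effects.
log_rent = 8.0 - 0.4 * log_size - 0.6 * anchor - 0.2 * brand_leader + rng.normal(0, 0.3, n)

# OLS: negative coefficients on size, anchor and brand-leader status mirror the
# "larger/anchor/brand-leader tenants pay relatively lower rents" finding.
X = np.column_stack([np.ones(n), log_size, anchor, brand_leader])
beta, *_ = np.linalg.lstsq(X, log_rent, rcond=None)
print(dict(zip(["const", "log_size", "anchor", "brand_leader"], np.round(beta, 3))))
```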

Relevance:

10.00%

Publisher:

Abstract:

Uranium series dating has been carried out on secondary uranyl silicate minerals formed during sub-glacial and post-glacial weathering of Proterozoic uraninite ores in southwest Finland. The samples were obtained from two sites adjacent to the Salpausselkä III ice marginal formation and cover a range of depths, from the surface to more than 60 m. Measured ages fall into three distinct groups: 70–100 ka, 28–36 ka and <2500 yr. The youngest set is associated with surface exposures, and the crystals display clear evidence of reworking. The most likely trigger for uranium release at depths below the surface weathering zone is intrusion of oxidising glacial melt water. Such melt water is often characterised by very high discharge rates along channels, which close once the overpressure generated at the ice margin is released. There is excellent correspondence between the two Finnish sites and published data for similar deposits over a large area of southern and central Sweden. None of the seventy samples analysed gave a U–Th age between 40 and 70 ka; a second hiatus is apparent at 20 ka, coinciding with the Last Glacial Maximum. Thus, the process responsible for uranyl silicate formation was halted for significant periods, owing to a change in geochemical conditions or the hydrogeological regime. These data support the presence of interstadial conditions during the Early and Middle Weichselian, since, in the absence of major climatic perturbations, the uranium phases at depth are stable. When viewed in conjunction with proxy data from mammoth remains, it would appear that the region was ice-free prior to the Last Glacial Maximum.

Relevance:

10.00%

Publisher:

Abstract:

As many as fourteen US states have now mandated minimum service requirements for real estate brokerage relationships in residential transactions. This study attempts to determine whether these minimum service laws have any impact on brokerage competition. Federal government agencies allege that such laws discourage competition because they limit the offering of non-traditional brokerage services. Alternatively, however, a legislative “bright line” definition of the lowest level of acceptable service may reduce any perceived risk in offering non-traditional brokerage services and therefore encourage competition. Using several empirical strategies and state-level data over nine years (2000–08), we do not find any consistent and significant impact (positive or negative) of minimum service laws on the number of licensees per 100 households, our proxy for competition. Interestingly, we also find that association strength, as measured by Realtor association membership penetration, has a strong deterrent effect on competition.
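
One of the "several empirical strategies" could be approximated by a simple difference-in-differences comparison of the competition proxy before and after law adoption; the sketch below uses invented state-level values of licensees per 100 households, purely for illustration.

```python
import numpy as np

# Invented values of the competition proxy (licensees per 100 households) for
# states that adopted a minimum service law ("treated") and states that did not.
treated_pre, treated_post = np.array([2.1, 1.9, 2.4]), np.array([2.2, 2.0, 2.5])
control_pre, control_post = np.array([2.0, 2.3, 1.8]), np.array([2.1, 2.4, 1.9])

# Difference-in-differences estimate of the law's effect on the proxy.
did = (treated_post.mean() - treated_pre.mean()) - (control_post.mean() - control_pre.mean())
print(f"DiD estimate: {did:+.3f} licensees per 100 households")
```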

Relevance:

10.00%

Publisher:

Abstract:

A detailed analysis is presented of solar UV spectral irradiance for the period between May 2003 and August 2005, when data are available from both the Solar Ultraviolet Spectral Irradiance Monitor (SUSIM) instrument (on board the Upper Atmosphere Research Satellite (UARS) spacecraft) and the Solar Stellar Irradiance Comparison Experiment (SOLSTICE) instrument (on board the Solar Radiation and Climate Experiment (SORCE) satellite). The ultimate aim is to develop a data composite that can be used to accurately determine any differences between the “exceptional” solar minimum at the end of solar cycle 23 and the previous minimum at the end of solar cycle 22, without having to rely on proxy data to set the long-term change. SUSIM data are studied because they are the only data available in the “SOLSTICE gap” between the end of available UARS SOLSTICE data and the start of the SORCE data. At any one wavelength, the two data sets are considered too dissimilar to be combined into a meaningful composite if any one of three correlations does not exceed a threshold of 0.8. This criterion removes all wavelengths except those in a small range between 156 nm and 208 nm, the longer wavelengths of which influence ozone production and heating in the lower stratosphere. Eight different methods are employed to intercalibrate the two data sequences. All methods give smaller changes between the minima than are seen when the data are not adjusted; however, correcting the SUSIM data to allow for an exponentially decaying offset drift gives a composite that is largely consistent with the unadjusted data from the SOLSTICE instruments on both UARS and SORCE and in which the recent minimum is consistently lower in the wave band studied.
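
A minimal sketch of one of the intercalibration steps named above: fitting an exponentially decaying offset between two overlapping series and removing it before forming a composite. The series, noise level and decay scale are synthetic, not SUSIM or SOLSTICE data.

```python
import numpy as np
from scipy.optimize import curve_fit

t = np.arange(800.0)                                    # days since the start of overlap
reference = 1.0 + 0.01 * np.sin(2 * np.pi * t / 27.0)   # reference series with 27-day modulation
drift = 0.05 * np.exp(-t / 300.0)                       # assumed decaying instrument offset
noisy = reference + drift + np.random.default_rng(1).normal(0.0, 0.002, t.size)

def offset_model(t, amp, tau):
    """Exponentially decaying offset between the two instruments."""
    return amp * np.exp(-t / tau)

params, _ = curve_fit(offset_model, t, noisy - reference, p0=(0.01, 100.0))
corrected = noisy - offset_model(t, *params)
print(f"fitted amplitude {params[0]:.3f}, decay time {params[1]:.0f} days")
print(f"residual rms after correction: {np.std(corrected - reference):.4f}")
```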

Relevance:

10.00%

Publisher:

Abstract:

Pesticide risk indicators provide simple support in the assessment of environmental and health risks from pesticide use, and can therefore inform policies to foster a sustainable interaction of agriculture with the environment. Because of their relative simplicity, indicators may be particularly useful under conditions of limited data availability and resources, such as in Less Developed Countries (LDCs). However, indicator complexity can vary significantly, in particular between those that rely on an exposure–toxicity ratio (ETR) and those that do not. In addition, pesticide risk indicators are usually developed for Western contexts, which might cause incorrect estimation in LDCs. This study investigated the appropriateness of seven pesticide risk indicators for use in LDCs, with reference to smallholding agriculture in Colombia. Seven farm-level indicators, of which three relied on an ETR (POCER, EPRIP, PIRI) and four on a non-ETR approach (EIQ, PestScreen, OHRI, Dosemeci et al., 2002), were calculated and then compared by means of the Spearman rank correlation test. Indicators were also compared with respect to key indicator characteristics, i.e. user-friendliness and ability to represent the system under study. The comparison of the indicators in terms of total environmental risk suggests that the indicators not relying on an ETR approach cannot be used as a reliable proxy for the more complex (ETR) indicators. ETR indicators, when user-friendly, show a comparative advantage over non-ETR indicators in best combining the need for a relatively simple tool, usable in contexts of limited data availability and resources, with the need for a reliable estimation of environmental risk. Non-ETR indicators remain useful and accessible tools for discriminating between different pesticides prior to application. Concerning human health risk, simple algorithms seem more appropriate for assessment in LDCs; however, further research on health risk indicators and their validation under LDC conditions is needed.
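
A minimal sketch of the two building blocks mentioned above: an exposure–toxicity ratio for a single application, and a Spearman rank correlation between the farm rankings produced by an ETR-based and a non-ETR indicator. The concentrations, endpoints and scores are illustrative, not values from POCER, EPRIP, PIRI or the other cited indicators.

```python
import numpy as np
from scipy.stats import spearmanr

def exposure_toxicity_ratio(predicted_env_conc, toxicity_endpoint):
    """ETR: predicted environmental concentration divided by a toxicity endpoint
    (e.g. an EC50 or NOEC); higher values indicate higher risk."""
    return predicted_env_conc / toxicity_endpoint

# Illustrative per-farm scores from an ETR-based and a non-ETR indicator.
etr_scores = np.array([0.8, 2.5, 0.1, 1.7, 0.4, 3.2])
non_etr_scores = np.array([15.0, 40.0, 22.0, 35.0, 9.0, 30.0])

rho, pvalue = spearmanr(etr_scores, non_etr_scores)
print(f"ETR for one application: {exposure_toxicity_ratio(0.02, 0.005):.1f}")
print(f"Spearman rho between indicator rankings: {rho:.2f} (p = {pvalue:.3f})")
```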

Relevance:

10.00%

Publisher:

Abstract:

This paper redefines technical efficiency by incorporating provision of environmental goods as one of the outputs of the farm. The proportion of permanent and rough grassland to total agricultural land area is used as a proxy for the provision of environmental goods. Stochastic frontier analysis was conducted using a Bayesian procedure. The methodology is applied to panel data on 215 dairy farms in England and Wales. Results show that farm efficiency rankings change when provision of environmental outputs by farms is incorporated in the efficiency analysis, which may have important political implications.
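
As a sketch of the kind of model the Bayesian stochastic frontier procedure estimates, a generic panel frontier with a composed error can be written as follows; this is a standard textbook form, not the paper's exact output aggregation, functional form or priors.

```latex
% Generic panel stochastic production frontier with composed error:
%   noise v_it and one-sided inefficiency u_it >= 0.
\ln y_{it} = \beta_0 + \sum_{k}\beta_k \ln x_{kit} + v_{it} - u_{it},
\qquad v_{it}\sim N(0,\sigma_v^2),\qquad u_{it}\ge 0,
\qquad \mathrm{TE}_{it} = \exp(-u_{it}).
```

Including the grassland-share proxy changes the output aggregate y, which is why the efficiency ranking implied by exp(-u_it) can change once environmental provision is counted as an output.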

Relevance:

10.00%

Publisher:

Abstract:

A number of transient climate runs simulating the last 120 kyr have been carried out using FAMOUS, a fast atmosphere-ocean general circulation model (AOGCM). This is the first time such experiments have been done with a full AOGCM, providing a three-dimensional simulation of both atmosphere and ocean over this period. Our simulation thus includes internally generated temporal variability over periods from days to millennia, and detailed physical representations of important processes such as clouds and precipitation. Although the model is fast, computational restrictions mean that the rate of change of the forcings has been increased by a factor of 10, making each experiment 12 kyr long. Atmospheric greenhouse gases (GHGs), northern hemisphere ice sheets and variations in solar radiation arising from changes in the Earth's orbit are treated as forcing factors and are applied either separately or combined in different experiments. The long-term temperature changes on Antarctica match well with reconstructions derived from ice-core data, as does variability on timescales longer than 10 kyr. Last Glacial Maximum (LGM) cooling on Greenland is reasonably well simulated, although our simulations, which lack ice-sheet meltwater forcing, do not reproduce the abrupt, millennial-scale climate shifts seen in northern hemisphere climate proxies or their slower southern hemisphere counterparts. The spatial pattern of sea surface cooling at the LGM matches proxy reconstructions reasonably well. There is significant anti-correlated variability in the strengths of the Atlantic Meridional Overturning Circulation (AMOC) and the Antarctic Circumpolar Current (ACC) on timescales greater than 10 kyr in our experiments. We find that GHG forcing weakens the AMOC and strengthens the ACC, whilst the presence of northern hemisphere ice sheets strengthens the AMOC and weakens the ACC. The structure of the AMOC at the LGM is found to be sensitive to the details of the ice-sheet reconstruction used. The precessional component of the orbital forcing induces ~20 kyr oscillations in the AMOC and ACC, whose amplitude is mediated by changes in the eccentricity of the Earth's orbit. These forcing influences combine, to first order, in a linear fashion to produce the mean climate and ocean variability seen in the run with all forcings.
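
The final sentence is, in effect, a first-order additivity check of the forced responses; schematically, for any diagnostic X (Antarctic temperature, AMOC strength, and so on), the single-forcing experiments are compared with the all-forcings run as below. The notation is a sketch, not the paper's own.

```latex
% First-order linear combination of the single-forcing responses:
\Delta X_{\mathrm{ALL}}(t) \;\approx\;
\Delta X_{\mathrm{GHG}}(t) + \Delta X_{\mathrm{ICE}}(t) + \Delta X_{\mathrm{ORB}}(t),
```

where each ΔX(t) is the anomaly of the diagnostic relative to the control state in the corresponding experiment.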

Relevance:

10.00%

Publisher:

Abstract:

During the Last Glacial Maximum (LGM, ∼21,000 years ago) the cold climate was strongly tied to low atmospheric CO2 concentration (∼190 ppm). Although it is generally assumed that this low CO2 was due to an expansion of the oceanic carbon reservoir, simulating the glacial level has remained a challenge, especially with the additional δ13C constraint. Indeed, the LGM carbon cycle was also characterized by a modern-like δ13C in the atmosphere and a higher surface-to-deep Atlantic δ13C gradient, indicating probable changes in the thermohaline circulation. Here we show, with a model of intermediate complexity, that adding three oceanic mechanisms (brine-induced stratification, stratification-dependent diffusion and iron fertilization) to the standard glacial simulation (which includes sea level drop, temperature change, carbonate compensation and terrestrial carbon release) decreases CO2 down to the glacial value of ∼190 ppm and simultaneously matches glacial atmospheric and oceanic δ13C inferred from proxy data. LGM CO2 and δ13C can at last be successfully reconciled.

Relevance:

10.00%

Publisher:

Abstract:

School effectiveness is a microtechnology of change. It is a relay device that transfers macro policy into everyday processes and priorities in schools. It is part of the growing apparatus of performance evaluation. Change is brought about by a focus on the school as a site-based system to be managed. There has been corporate restructuring in response to the changing political economy of education. There are now new work regimes and radical changes in organizational cultures. Education, like other public services, is now characterized by a range of structural realignments, new relationships between purchasers and providers, and new coalitions between management and politics. In this article, we argue that the school effectiveness movement is an example of new managerialism in education. It is part of an ideological and technological process to industrialize educational productivity. That is to say, the emphasis on standards and standardization is evocative of production regimes drawn from industry. There is a belief that education, like other public services, can be managed to ensure optimal outputs and zero defects in the educational product.

Relevance:

10.00%

Publisher:

Abstract:

Hourly winter weather of the Last Glacial Maximum (LGM) is simulated using the Community Climate Model version 3 (CCM3) on a globally resolved T170 (75 km) grid. Results are compared to a longer LGM climatological run with the same boundary conditions and monthly saves. Hourly-scale animations are used to enhance interpretations. The purpose of the study is to explore whether additional insights into ice age conditions can be gleaned by going beyond the standard employment of monthly average model statistics to infer ice age weather and climate. Results for both LGM runs indicate a decrease in North Atlantic and an increase in North Pacific cyclogenesis. Storm trajectories react to the mechanical forcing of the Laurentide Ice Sheet, with Pacific storms tracking over middle Alaska and northern Canada, terminating in the Labrador Sea. This result is consistent with other model results in also showing a significant reduction in Greenland wintertime precipitation, a response supported by ice core evidence. Higher temporal resolution puts into sharper focus the close tracking of Pacific storms along the west coast of North America. This response is consistent with increased poleward heat transport in the LGM climatological run and could help explain “early” glacial warming inferred in this region from proxy climate records. Additional analyses show a large increase in central Asian surface gustiness that supports observational inferences that upper-level winds associated with Asian-Pacific storms transported Asian dust to Greenland during the LGM.

Relevance:

10.00%

Publisher:

Abstract:

The rapid expansion of the TMT sector in the late 1990s and the more recent growing regulatory and corporate focus on business continuity and security have raised the profile of data centres. Data centres offer a unique blend of occupational, physical and technological characteristics compared to conventional real estate assets. Limited trading and the heterogeneity of data centres also cause higher levels of appraisal uncertainty. In practice, the application of conventional discounted cash flow approaches requires information about a wide range of inputs that is difficult to derive from limited market signals or to estimate analytically. This paper proposes an approach that uses pricing signals from similar traded cash flows. Based upon ‘the law of one price’, the method draws upon the premise that two identical future cash flows must have the same value now. Given the difficulties of estimating exit values, an alternative is that the expected cash flows of a data centre are analysed over the life cycle of the building, with corporate bond yields used to provide a proxy for the appropriate discount rates for lease income. Since liabilities are quite diverse, a number of proxies are suggested as discount and capitalisation rates, including index-linked, fixed-interest and zero-coupon bonds. Although there are rarely assets that have identical cash flows, and some approximation is necessary, the level of appraiser subjectivity is dramatically reduced.
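
A minimal sketch of the discounting idea described above: valuing a fixed lease income stream over an assumed building life cycle at a corporate bond yield used as the proxy discount rate. The rent, term and yield are illustrative figures, not market data.

```python
# Present value of a fixed lease cash flow discounted at a bond-yield proxy.
annual_rent = 1_200_000.0   # illustrative annual lease income
term_years = 15             # assumed remaining lease / life-cycle term
bond_yield = 0.055          # corporate bond yield used as the proxy discount rate

present_value = sum(
    annual_rent / (1.0 + bond_yield) ** year
    for year in range(1, term_years + 1)
)
print(f"present value of lease income: {present_value:,.0f}")
```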