929 results for “Shannon’s measure of uncertainty”


Relevance: 100.00%

Abstract:

Background: Previous research suggests that the phenotype associated with Asperger's syndrome (AS) includes difficulties in understanding the mental states of others, leading to difficulties in social communication and social relationships. It has also been suggested that the first-degree relatives of those with AS can demonstrate similar difficulties, albeit to a lesser extent. This study examined 'theory of mind' (ToM) abilities in the siblings of children with AS relative to a matched control group. Method: 27 children who had a sibling with AS were administered the children's version of the 'Eyes Test' (Baron-Cohen, Wheelwright, Stone, & Rutherford, 1999). The control group consisted of 27 children matched for age, sex, and a measure of verbal comprehension, and who did not have a family history of AS/autism. Results: A significant difference was found between the groups on the Eyes Test, with the siblings group showing poorer performance on this measure of social cognition. The difference was more pronounced among female siblings. Discussion: These results are discussed in terms of the familial distribution of a neuro-cognitive profile associated with AS, which confers varying degrees of social handicap amongst first-degree relatives. The implication of this finding for the autism/AS phenotype is explored, with some discussion of why this neuro-cognitive profile (in combination with corresponding strengths) may have an evolutionary imperative.
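The group comparison reported above can be illustrated with a simple resampling test. This is a hedged sketch, not the study's analysis: the scores below are invented for illustration (the real Eyes Test data are not reproduced here), and the permutation test simply asks how often a random regrouping of all 54 children produces a mean difference as large as the observed one.

```python
# Sketch of a two-group comparison via a permutation test.
# All scores are ILLUSTRATIVE, not the study's data.
import random

def mean(xs):
    return sum(xs) / len(xs)

def permutation_test(a, b, n_iter=5000, seed=0):
    """Two-sided permutation test on the difference in group means."""
    rng = random.Random(seed)
    observed = abs(mean(a) - mean(b))
    pooled = a + b
    count = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        if abs(mean(pooled[:len(a)]) - mean(pooled[len(a):])) >= observed:
            count += 1
    return count / n_iter

# Hypothetical Eyes Test scores (out of 28 items) for 27 children per group.
siblings = [15, 16, 14, 17, 15, 13, 16, 14, 15, 16, 13, 14, 15, 16,
            14, 15, 13, 16, 15, 14, 16, 15, 14, 13, 15, 16, 14]
controls = [18, 19, 17, 20, 18, 17, 19, 18, 17, 19, 18, 20, 17, 18,
            19, 18, 17, 19, 18, 20, 17, 18, 19, 18, 17, 19, 18]

p = permutation_test(siblings, controls)   # small p = groups differ
```

A permutation test makes no normality assumption, which suits bounded test scores; the study itself would have used conventional group statistics.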

Relevance: 100.00%

Abstract:

Background: Preventing childhood overweight and obesity has become a major public health issue in developed and developing countries. Systematic reviews of this topic have not provided practice-relevant guidance because of the generally low quality of research and the heterogeneity of reported effectiveness. Aim: To present practice-relevant guidance on interventions to reduce at least one measure of adiposity in child populations that do or do not contain overweight or obese children. Design: Systematic review of eligible randomized, controlled trials or controlled trials using a novel approach to synthesizing the trial results through application of descriptive epidemiological and realistic evaluation concepts. Eligible trials involved at least 30 participants, lasted at least 12 weeks and involved non-clinical child populations. Results: Twenty-eight eligible trials were identified to 30 April 2006. Eleven trials were effective and 17 were ineffective in reducing adiposity. Blind to outcome, the main factor distinguishing effective from ineffective trials was the provision of moderate to vigorous aerobic physical activity in the former on a relatively 'compulsory' rather than 'voluntary' basis. Conclusions: By using a novel approach to synthesizing trials, a decisive role for the 'compulsory' provision of aerobic physical activity has been demonstrated. Further research is required to identify how such activity can be sustained and transformed into a personally chosen behaviour by children and over the life course. (C) 2007 The Royal Institute of Public Health. Published by Elsevier Ltd. All rights reserved.

Relevance: 100.00%

Abstract:

Objective: This paper presents a detailed study of fractal-based methods for texture characterization of mammographic mass lesions and architectural distortion. The purpose of this study is to explore the use of fractal and lacunarity analysis for the characterization and classification of both tumor lesions and normal breast parenchyma in mammography. Materials and methods: We conducted comparative evaluations of five popular fractal dimension estimation methods for the characterization of the texture of mass lesions and architectural distortion. We applied the concept of lacunarity to the description of the spatial distribution of the pixel intensities in mammographic images. These methods were tested with a set of 57 breast masses and 60 normal breast parenchyma samples (dataset1), and with another set of 19 architectural distortions and 41 normal breast parenchyma samples (dataset2). Support vector machines (SVMs) were used as the pattern classification method for tumor classification. Results: Experimental results showed that the fractal dimension of regions of interest (ROIs) depicting mass lesions and architectural distortion was statistically significantly lower than that of normal breast parenchyma for all five methods. Receiver operating characteristic (ROC) analysis showed that the fractional Brownian motion (FBM) method generated the highest area under the ROC curve (Az = 0.839 for dataset1 and 0.828 for dataset2) among the five methods for both datasets. Lacunarity analysis showed that ROIs depicting mass lesions and architectural distortion had higher lacunarities than ROIs depicting normal breast parenchyma. The combination of FBM fractal dimension and lacunarity yielded higher Az values (0.903 and 0.875 for dataset1 and dataset2, respectively) than either feature alone for both datasets. The application of the SVM improved the performance of the fractal-based features in differentiating tumor lesions from normal breast parenchyma, yielding higher Az values. Conclusion: The FBM texture model is the most appropriate model for characterizing mammographic images, as its self-affinity assumption is a better approximation. Lacunarity is an effective counterpart to the fractal dimension in texture feature extraction from mammographic images. The classification results obtained in this work suggest that the SVM is an effective method with great potential for classification in mammographic image analysis.
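The two texture measures at the heart of the study can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: it estimates a box-counting fractal dimension and a gliding-box lacunarity on two synthetic binary images (a filled square and a diagonal line) for which the expected values are known, rather than on mammograms.

```python
# Minimal sketch: box-counting fractal dimension and gliding-box lacunarity
# on synthetic binary images (NOT the paper's mammographic pipeline).
import numpy as np

def box_counting_dimension(img, sizes=(2, 4, 8, 16)):
    """Slope of log N(s) versus log(1/s), where N(s) counts occupied s x s boxes."""
    counts = []
    for s in sizes:
        n = 0
        for i in range(0, img.shape[0], s):
            for j in range(0, img.shape[1], s):
                if img[i:i + s, j:j + s].any():
                    n += 1
        counts.append(n)
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

def lacunarity(img, box=4):
    """Gliding-box lacunarity: second moment of box masses over first moment squared."""
    masses = []
    for i in range(img.shape[0] - box + 1):
        for j in range(img.shape[1] - box + 1):
            masses.append(img[i:i + box, j:j + box].sum())
    m = np.array(masses, dtype=float)
    return (m ** 2).mean() / m.mean() ** 2

side = 64
filled = np.ones((side, side), dtype=int)   # smooth, space-filling texture
line = np.zeros((side, side), dtype=int)
np.fill_diagonal(line, 1)                   # sparse, gap-riddled texture

fd_filled = box_counting_dimension(filled)  # close to 2 (plane-filling)
fd_line = box_counting_dimension(line)      # close to 1 (curve-like)
lac_filled = lacunarity(filled)             # 1.0: translation-invariant, no gaps
lac_line = lacunarity(line)                 # larger: many empty boxes
```

The lower fractal dimension and higher lacunarity of the sparse image mirror, in miniature, the direction of the differences the paper reports between lesions and normal parenchyma.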

Relevance: 100.00%

Abstract:

We agree with Duckrow and Albano [Phys. Rev. E 67, 063901 (2003)] and Quian Quiroga et al. [Phys. Rev. E 67, 063902 (2003)] that mutual information (MI) is a useful measure of dependence for electroencephalogram (EEG) data, but we show that the improvement seen in the performance of MI in extracting dependence trends from EEG depends more on the type of MI estimator than on any embedding technique used. In an independent study conducted in search of an optimal MI estimator, in particular for EEG applications, we examined the performance of a number of MI estimators on the data set used by Quian Quiroga et al. in their original study, where the performance of different dependence measures on real data was investigated [Phys. Rev. E 65, 041903 (2002)]. We show that for EEG applications the best performance among the investigated estimators is achieved by the k-nearest-neighbor estimator, which supports the conjecture by Quian Quiroga et al. in Phys. Rev. E 67, 063902 (2003) that the nearest-neighbor estimator is the most precise method for estimating MI.
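The estimation problem the comment addresses can be illustrated with the simplest MI estimator of all. This sketch uses a histogram (plug-in) estimator on synthetic correlated Gaussian data, for which the true MI is known analytically; the k-nearest-neighbor estimator favored above is more involved but typically less biased. The bin count, correlation, and sample size are all illustrative choices.

```python
# Sketch: plug-in (histogram) MI estimate on synthetic correlated Gaussians,
# checked against the analytic value MI = -0.5 * ln(1 - rho^2) in nats.
import numpy as np

def binned_mi(x, y, bins=16):
    """Plug-in estimate of I(X;Y) in nats from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)          # marginal of X, shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)          # marginal of Y, shape (1, bins)
    nz = pxy > 0                                 # avoid log(0) on empty cells
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
rho, n = 0.8, 20000
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho ** 2) * rng.standard_normal(n)

mi_est = binned_mi(x, y)
mi_true = -0.5 * np.log(1 - rho ** 2)            # about 0.51 nats
```

Binning introduces both a discretization deficit and a positive finite-sample bias; estimator-dependent effects like these are exactly why the comment argues that the choice of estimator, not the embedding, drives the observed differences.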


Relevance: 100.00%

Abstract:

The climate belongs to the class of non-equilibrium forced and dissipative systems, for which most results of quasi-equilibrium statistical mechanics, including the fluctuation-dissipation theorem, do not apply. In this paper we show for the first time how the Ruelle linear response theory, developed for studying rigorously the impact of perturbations on general observables of non-equilibrium statistical mechanical systems, can be applied with great success to analyze the climatic response to general forcings. The crucial value of the Ruelle theory lies in the fact that it allows one to compute the response of the system in terms of expectation values of explicit and computable functions of the phase space averaged over the invariant measure of the unperturbed state. We choose as a test bed a classical version of the Lorenz 96 model, which, in spite of its simplicity, has well-recognized prototypical value: it is a spatially extended one-dimensional model and presents the basic ingredients of the actual atmosphere, such as dissipation, advection and the presence of an external forcing. We recapitulate the main aspects of the general response theory and propose some new general results. We then analyze the frequency dependence of the response of both local and global observables to perturbations having localized as well as global spatial patterns. We derive analytically several properties of the corresponding susceptibilities, such as asymptotic behavior, validity of Kramers-Kronig relations, and sum rules, whose main ingredient is the causality principle. We show that all the coefficients of the leading asymptotic expansions as well as the integral constraints can be written as linear functions of parameters that describe the unperturbed properties of the system, such as its average energy. Some newly obtained empirical closure equations for such parameters allow these properties to be defined as explicit functions of the unperturbed forcing parameter alone for a general class of chaotic Lorenz 96 models. We then verify the theoretical predictions against the outputs of the simulations to a high degree of precision. The theory is used to explain differences in the response of local and global observables, to define the intensive properties of the system, which do not depend on the spatial resolution of the Lorenz 96 model, and to generalize the concept of climate sensitivity to all time scales. We also show how to reconstruct the linear Green function, which maps perturbations of general time patterns into changes in the expectation value of the considered observable for finite as well as infinite time. Finally, we propose a simple yet general methodology for studying general climate change problems on virtually any time scale by resorting only to well-selected simulations, and by taking full advantage of ensemble methods. The specific case of the globally averaged surface temperature response to a general pattern of change of the CO2 concentration is discussed. We believe that the proposed approach may constitute a mathematically rigorous and practically very effective way to approach the problems of climate sensitivity, climate prediction, and climate change from a radically new perspective.
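The test bed itself is easy to reproduce. The sketch below integrates the standard Lorenz 96 model with a fourth-order Runge-Kutta scheme and samples an intensive observable (average energy per site) over the unperturbed attractor. The forcing F = 8 and the numerical settings are the conventional chaotic configuration, not necessarily the paper's exact setup.

```python
# Sketch: Lorenz 96 integration and sampling of an intensive observable.
# N = 40 sites, F = 8 (classic chaotic regime); settings are illustrative.
import numpy as np

def l96_tendency(x, F):
    """dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F, with cyclic indices."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x, F, dt):
    k1 = l96_tendency(x, F)
    k2 = l96_tendency(x + 0.5 * dt * k1, F)
    k3 = l96_tendency(x + 0.5 * dt * k2, F)
    k4 = l96_tendency(x + dt * k3, F)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

N, F, dt = 40, 8.0, 0.01
rng = np.random.default_rng(1)
x = F + 0.01 * rng.standard_normal(N)    # small perturbation of the fixed point

for _ in range(2000):                    # spin-up onto the attractor
    x = rk4_step(x, F, dt)

energies = []
for _ in range(5000):                    # sample the (approximately) invariant measure
    x = rk4_step(x, F, dt)
    energies.append(0.5 * np.dot(x, x) / N)

avg_energy = float(np.mean(energies))    # intensive: energy per site
```

Dividing by N makes the observable intensive, in the spirit of the paper's point that such quantities should not depend on the spatial resolution of the model.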

Relevance: 100.00%

Abstract:

SCIENTIFIC SUMMARY

Globally averaged total column ozone has declined over recent decades due to the release of ozone-depleting substances (ODSs) into the atmosphere. Now, as a result of the Montreal Protocol, ozone is expected to recover from the effects of ODSs as ODS abundances decline in the coming decades. However, a number of factors in addition to ODSs have led to and will continue to lead to changes in ozone. Discriminating between the causes of past and projected ozone changes is necessary, not only to identify the progress in ozone recovery from ODSs, but also to evaluate the effectiveness of climate and ozone protection policy options.

Factors Affecting Future Ozone and Surface Ultraviolet Radiation

• At least for the next few decades, the decline of ODSs is expected to be the major factor affecting the anticipated increase in global total column ozone. However, several factors other than ODSs will affect the future evolution of ozone in the stratosphere. These include changes in (i) stratospheric circulation and temperature due to changes in long-lived greenhouse gas (GHG) abundances, (ii) stratospheric aerosol loading, and (iii) source gases of highly reactive stratospheric hydrogen and nitrogen compounds. Factors that amplify the effects of ODSs on ozone (e.g., stratospheric aerosols) will likely decline in importance as ODSs are gradually eliminated from the atmosphere.

• Increases in GHG emissions can both positively and negatively affect ozone. Carbon dioxide (CO2)-induced stratospheric cooling elevates middle and upper stratospheric ozone and decreases the time taken for ozone to return to 1980 levels, while projected GHG-induced increases in tropical upwelling decrease ozone in the tropical lower stratosphere and increase ozone in the extratropics. Increases in nitrous oxide (N2O) and methane (CH4) concentrations also directly impact ozone chemistry, but the effects are different in different regions.
• The Brewer-Dobson circulation (BDC) is projected to strengthen over the 21st century and thereby affect ozone amounts. Climate models consistently predict an acceleration of the BDC or, more specifically, of the upwelling mass flux in the tropical lower stratosphere of around 2% per decade as a consequence of GHG abundance increases. A stronger BDC would decrease the abundance of tropical lower stratospheric ozone, increase poleward transport of ozone, and could reduce the atmospheric lifetimes of long-lived ODSs and other trace gases. While simulations showing faster ascent in the tropical lower stratosphere to date are a robust feature of chemistry-climate models (CCMs), this has not been confirmed by observations and the responsible mechanisms remain unclear.

• Substantial ozone losses could occur if stratospheric aerosol loading were to increase in the next few decades, while halogen levels are high. Stratospheric aerosol increases may be caused by sulfur contained in volcanic plumes entering the stratosphere or from human activities. The latter might include attempts to geoengineer the climate system by enhancing the stratospheric aerosol layer. The ozone losses mostly result from enhanced heterogeneous chemistry on stratospheric aerosols. Enhanced aerosol heating within the stratosphere also leads to changes in temperature and circulation that affect ozone.

• Surface ultraviolet (UV) levels will not be affected solely by ozone changes but also by the effects of climate change and by air quality change in the troposphere. These tropospheric effects include changes in clouds, tropospheric aerosols, surface reflectivity, and tropospheric sulfur dioxide (SO2) and nitrogen dioxide (NO2). The uncertainties in projections of these factors are large. Projected increases in tropospheric ozone are more certain and may lead to reductions in surface erythemal (“sunburning”) irradiance of up to 10% by 2100.
Changes in clouds may lead to decreases or increases in surface erythemal irradiance of up to 15% depending on latitude.

Expected Future Changes in Ozone

Full ozone recovery from the effects of ODSs and return of ozone to historical levels are not synonymous. In this chapter a key target date is chosen to be 1980, in part to retain the connection to previous Ozone Assessments. Noting, however, that decreases in ozone may have occurred in some regions of the atmosphere prior to 1980, 1960 return dates are also reported. The projections reported on in this chapter are taken from a recent compilation of CCM simulations. The ozone projections, which also form the basis for the UV projections, are limited in their representativeness of possible futures since they mostly come from CCM simulations based on a single GHG emissions scenario (scenario A1B of Emissions Scenarios. A Special Report of Working Group III of the Intergovernmental Panel on Climate Change, Cambridge University Press, 2000) and a single ODS emissions scenario (adjusted A1 of the previous (2006) Ozone Assessment). Throughout this century, the vertical, latitudinal, and seasonal structure of the ozone distribution will be different from what it was in 1980. For this reason, ozone changes in different regions of the atmosphere are considered separately.

• The projections of changes in ozone and surface clear-sky UV are broadly consistent with those reported on in the 2006 Assessment.

• The capability of making projections and attribution of future ozone changes has been improved since the 2006 Assessment. Use of CCM simulations from an increased number of models extending through the entire period of ozone depletion and recovery from ODSs (1960–2100) as well as sensitivity simulations have allowed more robust projections of long-term changes in the stratosphere and of the relative contributions of ODSs and GHGs to those changes.
• Global annually averaged total column ozone is projected to return to 1980 levels before the middle of the century and earlier than when stratospheric halogen loading returns to 1980 levels. CCM projections suggest that this early return is primarily a result of GHG-induced cooling of the upper stratosphere because the effects of circulation changes on tropical and extratropical ozone largely cancel. Global (90°S–90°N) annually averaged total column ozone will likely return to 1980 levels between 2025 and 2040, well before the return of stratospheric halogens to 1980 levels between 2045 and 2060.

• Simulated changes in tropical total column ozone from 1960 to 2100 are generally small. The evolution of tropical total column ozone in models depends on the balance between upper stratospheric increases and lower stratospheric decreases. The upper stratospheric increases result from declining ODSs and a slowing of ozone destruction resulting from GHG-induced cooling. Ozone decreases in the lower stratosphere mainly result from an increase in tropical upwelling. From 1960 until around 2000, a general decline is simulated, followed by a gradual increase to values typical of 1980 by midcentury. Thereafter, although total column ozone amounts decline slightly again toward the end of the century, by 2080 they are no longer expected to be affected by ODSs. Confidence in tropical ozone projections is compromised by the fact that simulated decreases in column ozone to date are not supported by observations, suggesting that significant uncertainties remain.

• Midlatitude total column ozone is simulated to evolve differently in the two hemispheres. Over northern midlatitudes, annually averaged total column ozone is projected to return to 1980 values between 2015 and 2030, while for southern midlatitudes the return to 1980 values is projected to occur between 2030 and 2040.
The more rapid return to 1980 values in northern midlatitudes is linked to a more pronounced strengthening of the poleward transport of ozone due to the effects of increased GHG levels, and effects of Antarctic ozone depletion on southern midlatitudes. By 2100, midlatitude total column ozone is projected to be above 1980 values in both hemispheres.

• October-mean Antarctic total column ozone is projected to return to 1980 levels after midcentury, later than in any other region, and yet earlier than when stratospheric halogen loading is projected to return to 1980 levels. The slightly earlier return of ozone to 1980 levels (2045–2060) results primarily from upper stratospheric cooling and resultant increases in ozone. The return of polar halogen loading to 1980 levels (2050–2070) in CCMs is earlier than in empirical models that exclude the effects of GHG-induced changes in circulation. Our confidence in the drivers of changes in Antarctic ozone is higher than for other regions because (i) ODSs exert a strong influence on Antarctic ozone, (ii) the effects of changes in GHG abundances are comparatively small, and (iii) projections of ODS emissions are more certain than those for GHGs. Small Antarctic ozone holes (areas of ozone <220 Dobson units, DU) could persist to the end of the 21st century.

• March-mean Arctic total column ozone is projected to return to 1980 levels two to three decades before polar halogen loading returns to 1980 levels, and to exceed 1980 levels thereafter. While CCM simulations project a return to 1980 levels between 2020 and 2035, most models tend not to capture observed low temperatures and thus underestimate present-day Arctic ozone loss such that it is possible that this return date is biased early. Since the strengthening of the Brewer-Dobson circulation through the 21st century leads to increases in springtime Arctic column ozone, by 2100 Arctic ozone is projected to lie well above 1960 levels.
Uncertainties in Projections

• Conclusions dependent on future GHG levels are less certain than those dependent on future ODS levels since ODS emissions are controlled by the Montreal Protocol. For the six GHG scenarios considered by a few CCMs, the simulated differences in stratospheric column ozone over the second half of the 21st century are largest in the northern midlatitudes and the Arctic, with maximum differences of 20–40 DU between the six scenarios in 2100.

• There remain sources of uncertainty in the CCM simulations. These include the use of prescribed ODS mixing ratios instead of emission fluxes as lower boundary conditions, the range of sea surface temperatures and sea ice concentrations, missing tropospheric chemistry, model parameterizations, and model climate sensitivity.

• Geoengineering schemes for mitigating climate change by continuous injections of sulfur-containing compounds into the stratosphere, if implemented, would substantially affect stratospheric ozone, particularly in polar regions. Ozone losses observed following large volcanic eruptions support this prediction. However, sporadic volcanic eruptions provide limited analogs to the effects of continuous sulfur emissions. Preliminary model simulations reveal large uncertainties in assessing the effects of continuous sulfur injections.

Expected Future Changes in Surface UV

While a number of factors, in addition to ozone, affect surface UV irradiance, the focus in this chapter is on the effects of changes in stratospheric ozone on surface UV. For this reason, clear-sky surface UV irradiance is calculated from ozone projections from CCMs.

• Projected increases in midlatitude ozone abundances during the 21st century, in the absence of changes in other factors, in particular clouds, tropospheric aerosols, and air pollutants, will result in decreases in surface UV irradiance.
Clear-sky erythemal irradiance is projected to return to 1980 levels on average in 2025 for the northern midlatitudes, and in 2035 for the southern midlatitudes, and to fall well below 1980 values by the second half of the century. However, actual changes in surface UV will be affected by a number of factors other than ozone.

• In the absence of changes in other factors, changes in tropical surface UV will be small because changes in tropical total column ozone are projected to be small. By the middle of the 21st century, the model projections suggest surface UV to be slightly higher than in the 1960s, very close to values in 1980, and slightly lower than in 2000. The projected decrease in tropical total column ozone through the latter half of the century will likely result in clear-sky surface UV remaining above 1960 levels. Average UV irradiance is already high in the tropics due to naturally occurring low total ozone columns and high solar elevations.

• The magnitude of UV changes in the polar regions is larger than elsewhere because ozone changes in polar regions are larger. For the next decades, surface clear-sky UV irradiance, particularly in the Antarctic, will continue to be higher than in 1980. Future increases in ozone and decreases in clear-sky UV will occur at slower rates than those associated with the ozone decreases and UV increases that occurred before 2000. In Antarctica, surface clear-sky UV is projected to return to 1980 levels between 2040 and 2060, while in the Arctic this is projected to occur between 2020 and 2030. By 2100, October surface clear-sky erythemal irradiance in Antarctica is likely to be between 5% below to 25% above 1960 levels, with considerable uncertainty. This is consistent with multi-model-mean October Antarctic total column ozone not returning to 1960 levels by 2100. In contrast, by 2100, surface clear-sky UV in the Arctic is projected to be 0–10% below 1960 levels.

Relevance: 100.00%

Abstract:

A significant challenge in the prediction of climate change impacts on ecosystems and biodiversity is quantifying the sources of uncertainty that emerge within and between different models. Statistical species niche models have grown in popularity, yet no single best technique has been identified, as performance differs between situations. Our aim was to quantify the uncertainties associated with the application of two complementary modelling techniques. Generalised linear mixed models (GLMMs) and generalised additive mixed models (GAMMs) were used to model the realised niche of ombrotrophic Sphagnum species in British peatlands. These models were then used to predict changes in Sphagnum cover between 2020 and 2050 based on projections of climate change and atmospheric deposition of nitrogen and sulphur. Over 90% of the variation in the GLMM predictions was due to niche model parameter uncertainty, dropping to 14% for the GAMM. After covarying out other factors, average variation in predicted values of Sphagnum cover across UK peatlands was the next largest source of variation (8% for the GLMM and 86% for the GAMM). The better performance of the GAMM needs to be weighed against its tendency to overfit the training data. While our niche models are only a first approximation, we used them to undertake a preliminary evaluation of the relative importance of climate change and nitrogen and sulphur deposition, and of the geographic locations of the largest expected changes in Sphagnum cover. Predicted changes in cover were all small (generally <1% in an average 4 m² unit area) but also highly uncertain. The peatlands expected to be most affected by climate change in combination with atmospheric pollution were Dartmoor, the Brecon Beacons and the western Lake District.
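The variance-partitioning idea can be sketched without fitting an actual GLMM or GAMM. In the illustration below, prediction variance for a simple logistic cover model is split into a parameter-uncertainty component (variance across draws from an assumed coefficient distribution) and a between-site component; the model form, the coefficient distribution, and every number are assumptions for illustration only, not the paper's fits.

```python
# Sketch: partitioning prediction variance into parameter uncertainty
# versus between-site variation. All quantities are ILLUSTRATIVE.
import numpy as np

rng = np.random.default_rng(2)

n_sites, n_draws = 50, 200
climate = rng.uniform(-1, 1, n_sites)             # standardised site covariate

beta_mean, beta_sd = 1.2, 0.4                     # assumed coefficient distribution
betas = rng.normal(beta_mean, beta_sd, n_draws)   # parameter-uncertainty draws

# Predicted cover on a logistic link: one row per coefficient draw.
logits = betas[:, None] * climate[None, :]
cover = 1.0 / (1.0 + np.exp(-logits))

var_parameter = cover.var(axis=0).mean()          # mean over sites of draw variance
var_between_sites = cover.mean(axis=0).var()      # variance of the site means
share_parameter = var_parameter / (var_parameter + var_between_sites)
```

`share_parameter` plays the role of the ">90% for the GLMM, 14% for the GAMM" figures above: the fraction of total prediction variance attributable to uncertainty in the fitted coefficients rather than to genuine differences between sites.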

Relevance: 100.00%

Abstract:

Climate controls upland habitats, soils and their associated ecosystem services; therefore, understanding possible changes in upland climatic conditions can provide a rapid assessment of climatic vulnerability over the next century. We used three different climatic indices that were optimised to fit the upland area classified by the EU as a Severely Disadvantaged Area (SDA) over 1961–1990. Upland areas within the SDA covered all altitudinal ranges, whereas the maximum altitude of lowland areas outside the SDA was ca. 300 m. In general, the climatic index based on the ratio between annual accumulated temperature (a measure of growing season length) and annual precipitation predicted 96% of the mapped SDA area, slightly better than the indices based on annual or seasonal water deficit. Overall, all climatic indices showed that upland environments were exposed to some degree of change by 2071–2100 under UKCIP02 climate projections for the high and low emissions scenarios. The projected area declined by 13 to 51% across the three indices for the low emissions scenario and by 24 to 84% for the high emissions scenario. The mean altitude of the upland area increased by +11 to +86 m for the low scenario and +21 to +178 m for the high scenario. Low-altitude areas in eastern and southern Great Britain were most vulnerable to change. These projected climatic changes are likely to affect upland habitat composition, long-term soil carbon storage and wider ecosystem service provision, although it is not yet possible to determine the rate at which this might occur.
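The index construction can be sketched as follows. This is an illustrative stand-in, not the study's data or optimisation: synthetic grids of accumulated temperature and precipitation are combined into a ratio index, thresholded, and compared against a reference upland mask that plays the role of the SDA. The lapse rates, noise levels, and threshold are all invented.

```python
# Sketch: a ratio-type climatic index (accumulated temperature / precipitation)
# thresholded to reproduce a reference upland mask. All grids are SYNTHETIC.
import numpy as np

rng = np.random.default_rng(3)
n_cells = 10000
altitude = rng.uniform(0, 1000, n_cells)                               # m

# Synthetic climate: uplands are cooler (shorter growing season) and wetter.
accumulated_temp = 2000 - 1.5 * altitude + rng.normal(0, 50, n_cells)  # degree-days
precipitation = 800 + 1.2 * altitude + rng.normal(0, 60, n_cells)      # mm / yr

index = accumulated_temp / precipitation
sda_mask = altitude > 300          # stand-in for the mapped SDA upland area

threshold = 1.0                    # illustrative cut-off (would be optimised)
predicted_upland = index < threshold   # low ratio = short season + high rainfall

# Fraction of the reference upland area that the thresholded index captures.
agreement = (predicted_upland & sda_mask).sum() / sda_mask.sum()
```

In the study, the threshold is optimised so that this overlap reaches 96% against the real SDA map; here the value simply illustrates the calculation.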

Relevance: 100.00%

Abstract:

The retention of peatland carbon (C) and the ability to continue to draw down and store C from the atmosphere is not only important for the UK terrestrial carbon inventory, but also for a range of ecosystem services, the landscape value and the ecology and hydrology of ~15% of the land area of the UK. Here we review the current state of knowledge on the C balance of UK peatlands using several studies which highlight not only the importance of making good flux measurements, but also the spatial and temporal variability of different flux terms that characterise a landscape affected by a range of natural and anthropogenic processes and threats. Our data emphasise the importance of measuring (or accurately estimating) all components of the peatland C budget. We highlight the role of the aquatic pathway and suggest that fluxes are higher than previously thought. We also compare the contemporary C balance of several UK peatlands with historical rates of C accumulation measured using peat cores, thus providing a long-term context for present-day measurements and their natural year-on-year variability. Contemporary measurements from 2 sites suggest that current accumulation rates (–56 to –72 g C m–2 yr–1) are at the lower end of those seen over the last 150 yr in peat cores (–35 to –209 g C m–2 yr–1). Finally, we highlight significant current gaps in knowledge and identify where levels of uncertainty are high, as well as emphasise the research challenges that need to be addressed if we are to improve the measurement and prediction of change in the peatland C balance over future decades.
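The budgeting arithmetic the review emphasises can be made concrete. The flux values below are illustrative magnitudes chosen to fall within the ranges discussed (in g C m⁻² yr⁻¹, with negative values denoting uptake by the peatland); they are not measurements from any site, and the set of terms is a simplified version of a full peatland carbon budget.

```python
# Sketch: assembling a net peatland carbon balance from its component fluxes.
# Units: g C m^-2 yr^-1; negative = uptake. All values ILLUSTRATIVE.

fluxes = {
    "net_ecosystem_exchange": -95.0,  # CO2 uptake by vegetation minus respiration
    "methane_emission": 5.0,          # CH4-C loss to the atmosphere
    "doc_export": 25.0,               # dissolved organic carbon (aquatic pathway)
    "poc_export": 5.0,                # particulate organic carbon (aquatic pathway)
    "dissolved_co2_export": 4.0,      # dissolved inorganic C in drainage water
}

net_balance = sum(fluxes.values())    # negative = the peatland is a net C sink

# Share of total C losses carried by the aquatic pathway.
losses = sum(v for v in fluxes.values() if v > 0)
aquatic_share = (fluxes["doc_export"] + fluxes["poc_export"]
                 + fluxes["dissolved_co2_export"]) / losses
```

With these illustrative numbers the aquatic pathway dominates the loss terms, which is the review's point: omitting it would substantially overstate the apparent sink strength.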

Relevance: 100.00%

Abstract:

Research has been conducted on methodological issues concerning the Theory of Planned Behaviour (TPB): determining an appropriate measurement (direct and indirect) of constructs and selecting a plausible scaling technique (unipolar or bipolar) for the constructs (attitude, subjective norm, perceived behavioural control and intention) that are important in explaining farm-level tree planting in Pakistan. Unipolar scoring of beliefs showed higher correlations among the TPB constructs than the bipolar scaling technique. Both direct and indirect methods yielded significant results in explaining the intention to practise farm forestry, except for the belief-based measure of perceived behavioural control (PBC), which was statistically non-significant. A need to examine the scoring of PBC more carefully has been identified.
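The scoring comparison can be sketched directly. In the simulation below, the same 1–7 belief ratings are scored unipolarly (1..7) and bipolarly (-3..+3), and each belief-based composite is correlated with a direct measure. Note that the data are simulated so that the unipolar composite tracks the direct measure; this builds in, rather than tests, the pattern reported above, and none of it is the Pakistani farm-forestry data.

```python
# Sketch: unipolar versus bipolar scoring of TPB belief items. SIMULATED data.
import random

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

rng = random.Random(4)
n = 200
belief_strength = [rng.randint(1, 7) for _ in range(n)]   # 1..7 ratings
outcome_eval = [rng.randint(1, 7) for _ in range(n)]      # 1..7 ratings

# Belief-based (indirect) attitude: strength x evaluation, two scorings.
unipolar = [b * e for b, e in zip(belief_strength, outcome_eval)]
bipolar = [b * (e - 4) for b, e in zip(belief_strength, outcome_eval)]

# Direct attitude measure, simulated to track the unipolar composite.
direct = [u / 7 + rng.gauss(0, 1) for u in unipolar]

r_unipolar = pearson_r(unipolar, direct)
r_bipolar = pearson_r(bipolar, direct)
```

Rescaling the evaluation item shifts the product composite by a multiple of belief strength, which is why the two scorings can correlate quite differently with the same criterion.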

Relevance: 100.00%

Abstract:

A series of model experiments with the coupled Max-Planck-Institute ECHAM5/OM climate model have been investigated and compared with microwave measurements from the Microwave Sounding Unit (MSU) and re-analysis data for the period 1979–2008. The evaluation is carried out by computing the Temperature in the Lower Troposphere (TLT) and Temperature in the Middle Troposphere (TMT) using the MSU weights from both University of Alabama (UAH) and Remote Sensing Systems (RSS) and restricting the study to primarily the tropical oceans. When forced by analysed sea surface temperature the model reproduces accurately the time-evolution of the mean outgoing tropospheric microwave radiation especially over tropical oceans but with a minor bias towards higher temperatures in the upper troposphere. The latest reanalyses data from the 25 year Japanese re-analysis (JRA25) and European Center for Medium Range Weather Forecasts Interim Reanalysis are in very close agreement with the time-evolution of the MSU data with a correlation of 0.98 and 0.96, respectively. The re-analysis trends are similar to the trends obtained from UAH but smaller than the trends from RSS. Comparison of TLT, computed from observations from UAH and RSS, with Sea Surface Temperature indicates that RSS has a warm bias after 1993. In order to identify the significance of the tropospheric linear temperature trends we determined the natural variability of 30-year trends from a 500 year control integration of the coupled ECHAM5 model. The model exhibits natural unforced variations of the 30 year tropospheric trend that vary within ±0.2 K/decade for the tropical oceans. This general result is supported by similar results from the Geophysical Fluid Dynamics Laboratory (GFDL) coupled climate model. Present MSU observations from UAH for the period 1979–2008 are well within this range but RSS is close to the upper positive limit of this variability. 
We have also compared the trend of the vertical lapse rate over the tropical oceans, assuming that the difference between TLT and TMT is an approximate measure of the lapse rate. The TLT–TMT trend is larger in both the measurements and JRA25 than in the model runs, by 0.04–0.06 K/decade. Furthermore, all 30-year TLT–TMT trends of the unforced 500-year integration vary within ±0.03 K/decade, suggesting that the models have a minor systematic warm bias in the upper troposphere.
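
The significance test described above, comparing an observed 30-year trend against the spread of all overlapping 30-year trends in a long unforced control run, can be sketched as follows. The control series here is white noise standing in for the 500-year integration (a real control run has interannual persistence, so its trend envelope is wider), and the trend estimator is ordinary least squares on a yearly index:

```python
import random

def linear_trend(series):
    """Least-squares slope per time step (pure-Python OLS on an index)."""
    n = len(series)
    xbar = (n - 1) / 2
    ybar = sum(series) / n
    num = sum((i - xbar) * (y - ybar) for i, y in enumerate(series))
    den = sum((i - xbar) ** 2 for i in range(n))
    return num / den

def sliding_trends(series, window):
    """All overlapping linear trends of the given window length."""
    return [linear_trend(series[i:i + window])
            for i in range(len(series) - window + 1)]

# Illustrative stand-in for a 500-year unforced control run of
# tropical-mean tropospheric temperature anomalies (K)
random.seed(0)
control = [0.3 * random.gauss(0, 1) for _ in range(500)]

trends = sliding_trends(control, 30)   # K/year, 471 overlapping windows
envelope = (min(trends), max(trends))  # natural range of 30-year trends
```

An observed trend falling outside this envelope would be inconsistent with unforced variability alone; a trend near the edge of it, as for RSS in the study, is marginal.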

Relevance: 100.00%

Abstract:

Commercial real estate investors have well-established methods to assess the risks of a property investment in their home country. However, when the investment decision is overseas, another dimension of uncertainty overlays the analysis. This additional dimension, typically called country risk, encompasses the uncertainty of achieving expected financial results solely due to factors relating to the investment's location in another country. Yet very little has been done to examine the effects of country risk on international real estate returns, even though considerations of country risk dominate international investment decisions. This study extends the literature on international real estate diversification by empirically estimating the impact of country risk, as measured by Euromoney, on the direct real estate returns of 15 countries over the period 1998–2004, using a pooled regression analysis approach. The results suggest that country risk data may help investors in their international real estate decisions, since the country risk data show a significant and consistent impact on real estate return performance.
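
The pooled regression approach can be sketched as follows. The data here are entirely hypothetical (the study's actual panel covers 15 countries over 1998–2004 with Euromoney scores): each observation is one country-year pairing a risk score with that year's direct real estate total return, and a single OLS line is fitted to the pooled sample.

```python
def ols(x, y):
    """Simple pooled bivariate OLS of y on x: returns (intercept, slope)."""
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    sxx = sum((xi - xbar) ** 2 for xi in x)
    slope = sxy / sxx
    return ybar - slope * xbar, slope

# Hypothetical pooled panel: each observation is one country-year,
# pairing a Euromoney-style risk score (higher = safer) with the
# direct real estate total return for that year (%).
risk_scores = [90, 85, 70, 60, 95, 75]
returns = [8.0, 7.5, 10.5, 12.0, 7.0, 9.5]

intercept, slope = ols(risk_scores, returns)
# A negative slope here would indicate higher returns in riskier
# countries, i.e. a country-risk premium in the pooled sample.
```

A full replication would add year fixed effects and inference on the slope; the sketch only shows the pooled-estimation idea.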

Relevance: 100.00%

Abstract:

Uncertainty affects all aspects of the property market, but one area where its impact is particularly significant is feasibility analysis. Any development is affected by differences between market conditions at the conception of the project and the market realities at the time of completion. The feasibility study needs to address the possible outcomes based on an understanding of the current market. This requires the appraiser to forecast the most likely outcome for the sale price of the completed development, the construction costs and the timing of both. It also requires the appraiser to understand the impact of finance on the project. All these issues are time-sensitive, and analysis needs to be undertaken to show the impact of time on the viability of the project. The future is uncertain, and a full feasibility analysis should be able to model the upside and downside risk pertaining to a range of possible outcomes. Feasibility studies are extensively used in Italy to determine land value, but they tend to be single-point analyses based upon a single set of "likely" inputs. In this paper we look at the practical impact of uncertainty in variables using a simulation model (Crystal Ball©) with an actual case study of an urban redevelopment plan for an Italian Municipality. This allows the appraiser to address the issues of uncertainty involved and thus provide the decision maker with a better understanding of the risk of development. This technique is then refined using a "two-dimensional technique" to distinguish between "uncertainty" and "variability" and thus create a more robust model.
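
The move from a single-point appraisal to a simulation can be sketched as follows. This is a minimal Monte Carlo over a residual valuation, not the paper's Crystal Ball© model: the input ranges, the half-cost finance convention and the currency amounts are illustrative assumptions, and each input is drawn from a triangular distribution (minimum, maximum, most likely), a common choice when only a "likely" value and bounds are known.

```python
import random

def residual_value(gdv, costs, finance_rate, months):
    """Residual land value: completed sale value less construction costs
    and finance, with interest charged on half the costs (S-curve proxy)."""
    finance = costs * finance_rate * (months / 12) / 2
    return gdv - costs - finance

def simulate(n=10_000, seed=1):
    """Monte Carlo over triangular input distributions (illustrative ranges)."""
    rng = random.Random(seed)
    outcomes = []
    for _ in range(n):
        gdv = rng.triangular(9.0, 13.0, 11.0)    # sale value, EUR m
        costs = rng.triangular(6.5, 8.5, 7.5)    # construction costs, EUR m
        months = rng.triangular(18, 30, 24)      # development period
        outcomes.append(residual_value(gdv, costs, 0.06, months))
    outcomes.sort()
    return outcomes

outcomes = simulate()
downside = outcomes[len(outcomes) // 20]    # ~5th percentile
upside = outcomes[-(len(outcomes) // 20)]   # ~95th percentile
```

The single-point appraisal corresponds to evaluating `residual_value` once at the most-likely inputs; the simulation replaces that with a distribution whose tails quantify the upside and downside risk. The paper's two-dimensional refinement would nest this loop, simulating variability within each sampled state of uncertainty.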

Relevance: 100.00%

Abstract:

Carsberg (2002) suggested that the periodic valuation accuracy studies undertaken by, amongst others, IPD/Drivers Jonas (2003) should be undertaken every year and be sponsored by the RICS, which acts as the self-regulating body for valuations in the UK. This paper does not address the wider issues concerning the nature of properties which are sold and whether sale prices are influenced by prior valuations, but considers solely the technical issues concerning the timing of the valuation and sales data. This study uses valuations and sales data from the Investment Property Databank UK Monthly Index to attempt to identify the date on which sale data is divulged to valuers. This information will inform accuracy studies that use the closeness of a valuation to the sale completion date as a yardstick for excluding data from the analysis. It will also, assuming valuers are informed quickly of any agreed sales, help to determine the actual sale-agreed date rather than the completion date, which includes a period of due diligence between when the sale is agreed and its completion. Valuations should be updated to this date, rather than the formal completion date, if a reliable measure of valuation accuracy is to be determined. An accuracy study is then undertaken using a variety of updating periods, and the differences between the results are examined. The paper concludes that the sale only becomes known to valuers in the month prior to the sale taking place, suggesting either that due diligence procedures are shortening or that valuers are not told quickly of agreed sale prices. Studies that adopt a four-month cut-off date for valuations compared to sale completion dates are therefore over-cautious, and this could be reduced to two months without compromising the data.
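
The cut-off logic can be sketched as follows. The monthly valuation track and the step it contains are hypothetical, not IPD data: the accuracy measure is the percentage difference between the sale price and the valuation taken a given number of months before completion, and a sharp jump in the track shortly before completion is the signature of the agreed price having been divulged to the valuer.

```python
def accuracy_at_cutoff(valuations, sale_price, months_before):
    """Percentage difference between the achieved sale price and the
    valuation recorded a given number of months before completion.
    valuations[-1] is the valuation in the completion month."""
    valuation = valuations[-(months_before + 1)]
    return 100.0 * (sale_price - valuation) / valuation

# Hypothetical property: monthly valuations up to completion (GBP m).
# The jump two months out is the kind of step that suggests the agreed
# price has reached the valuer by then.
valuations = [10.0, 10.1, 10.1, 10.2, 10.9, 11.0]
sale_price = 11.0

errors = {m: accuracy_at_cutoff(valuations, sale_price, m) for m in range(5)}
# errors[0] compares the final valuation; errors[4] a four-month cut-off.
```

Tabulating `errors` across many sales at each cut-off is the form of the study's comparison: if errors only shrink sharply inside two months, a four-month exclusion window discards usable observations.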