Abstract:
We agree with Duckrow and Albano [Phys. Rev. E 67, 063901 (2003)] and Quian Quiroga [Phys. Rev. E 67, 063902 (2003)] that mutual information (MI) is a useful measure of dependence for electroencephalogram (EEG) data, but we show that the improvement seen in the performance of MI on extracting dependence trends from EEG depends more on the type of MI estimator than on any embedding technique used. In an independent study conducted in search of an optimal MI estimator, particularly for EEG applications, we examined the performance of a number of MI estimators on the data set used by Quian Quiroga in their original study, where the performance of different dependence measures on real data was investigated [Phys. Rev. E 65, 041903 (2002)]. We show that for EEG applications the best performance among the investigated estimators is achieved by the k-nearest-neighbor estimator, which supports the conjecture by Quian Quiroga in Phys. Rev. E 67, 063902 (2003) that the nearest neighbor estimator is the most precise method for estimating MI.
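For context, a minimal sketch of the k-nearest-neighbour MI estimator of Kraskov, Stögbauer and Grassberger (algorithm 1), the family of estimators the conclusion points to; the choice of k and the variable names are illustrative assumptions, and this is not necessarily the exact implementation benchmarked in the study.

```python
# Sketch of the KSG (algorithm 1) k-nearest-neighbour MI estimator;
# assumes continuous-valued samples with no tied points.
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mutual_information(x, y, k=4):
    """Estimate I(X;Y) in nats from paired 1-D samples x, y."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    xy = np.column_stack((x, y))
    # Distance to the k-th neighbour in the joint space (Chebyshev metric);
    # the query returns each point itself first, hence k + 1.
    eps = cKDTree(xy).query(xy, k=k + 1, p=np.inf)[0][:, -1]
    tree_x, tree_y = cKDTree(x[:, None]), cKDTree(y[:, None])
    # Count marginal neighbours strictly inside eps, excluding the point itself.
    nx = np.array([len(tree_x.query_ball_point([xi], ei - 1e-12)) - 1
                   for xi, ei in zip(x, eps)])
    ny = np.array([len(tree_y.query_ball_point([yi], ei - 1e-12)) - 1
                   for yi, ei in zip(y, eps)])
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))
```

As a sanity check, for bivariate Gaussian samples with correlation rho the estimate should approach the analytical value -0.5 * log(1 - rho**2).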
Abstract:
The climate belongs to the class of non-equilibrium forced and dissipative systems, for which most results of quasi-equilibrium statistical mechanics, including the fluctuation-dissipation theorem, do not apply. In this paper we show for the first time how the Ruelle linear response theory, developed for studying rigorously the impact of perturbations on general observables of non-equilibrium statistical mechanical systems, can be applied with great success to analyze the climatic response to general forcings. The crucial value of the Ruelle theory lies in the fact that it allows one to compute the response of the system in terms of expectation values of explicit and computable functions of the phase space averaged over the invariant measure of the unperturbed state. We choose as a test bed a classical version of the Lorenz 96 model, which, in spite of its simplicity, has a well-recognized prototypical value: it is a spatially extended one-dimensional model featuring the basic ingredients of the actual atmosphere, such as dissipation, advection, and the presence of an external forcing. We recapitulate the main aspects of the general response theory and propose some new general results. We then analyze the frequency dependence of the response of both local and global observables to perturbations having localized as well as global spatial patterns. We derive analytically several properties of the corresponding susceptibilities, such as asymptotic behavior, validity of Kramers-Kronig relations, and sum rules, whose main ingredient is the causality principle. We show that all the coefficients of the leading asymptotic expansions, as well as the integral constraints, can be written as linear functions of parameters that describe the unperturbed properties of the system, such as its average energy. Some newly obtained empirical closure equations for such parameters allow these properties to be expressed as explicit functions of the unperturbed forcing parameter alone for a general class of chaotic Lorenz 96 models. We then verify the theoretical predictions against the outputs of the simulations to a high degree of precision. The theory is used to explain differences in the response of local and global observables, to define the intensive properties of the system, which do not depend on the spatial resolution of the Lorenz 96 model, and to generalize the concept of climate sensitivity to all time scales. We also show how to reconstruct the linear Green function, which maps perturbations of general time patterns into changes in the expectation value of the considered observable for finite as well as infinite time. Finally, we propose a simple yet general methodology for studying general climate change problems on virtually any time scale by resorting only to well-selected simulations and by taking full advantage of ensemble methods. The specific case of the globally averaged surface temperature response to a general pattern of change of the CO2 concentration is discussed. We believe that the proposed approach may constitute a mathematically rigorous and practically very effective way to approach the problem of climate sensitivity, climate prediction, and climate change from a radically new perspective.
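As a concrete illustration of the test bed, the sketch below integrates a standard one-level Lorenz 96 model and estimates the sensitivity of a global observable (the mean energy) to a small step perturbation of the forcing F by brute-force ensemble averaging. This is a toy under stated assumptions, not the authors' response-theory computation: the model size, forcing, perturbation amplitude and ensemble settings are all illustrative.

```python
# Minimal sketch: finite-difference estimate of the response of the
# mean-energy observable E = 0.5 * sum_i x_i^2 of a one-level Lorenz 96
# model to a small step change dF in the forcing. Parameters are illustrative.
import numpy as np

def l96_tendency(x, F):
    """Lorenz 96: dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F."""
    return ((np.roll(x, -1, axis=0) - np.roll(x, 2, axis=0))
            * np.roll(x, 1, axis=0) - x + F)

def rk4_step(x, F, dt):
    k1 = l96_tendency(x, F)
    k2 = l96_tendency(x + 0.5 * dt * k1, F)
    k3 = l96_tendency(x + 0.5 * dt * k2, F)
    k4 = l96_tendency(x + dt * k3, F)
    return x + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

def energy_sensitivity(n=36, F=8.0, dF=0.1, members=100,
                       spinup=2000, steps=2000, dt=0.005, seed=0):
    """Finite-difference estimate of d<E>/dF over the attractor."""
    rng = np.random.default_rng(seed)
    ens = rng.normal(size=(n, members))     # columns are ensemble members
    for _ in range(spinup):                 # relax onto the attractor
        ens = rk4_step(ens, F, dt)
    base, pert = ens.copy(), ens.copy()
    e_base = e_pert = 0.0
    for _ in range(steps):                  # twin runs: control vs perturbed
        base = rk4_step(base, F, dt)
        pert = rk4_step(pert, F + dF, dt)
        e_base += np.mean(0.5 * np.sum(base ** 2, axis=0))
        e_pert += np.mean(0.5 * np.sum(pert ** 2, axis=0))
    return (e_pert - e_base) / (steps * dF)

print(energy_sensitivity())
```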
Abstract:
Climate controls upland habitats, soils and their associated ecosystem services; therefore, understanding possible changes in upland climatic conditions can provide a rapid assessment of climatic vulnerability over the next century. We used 3 different climatic indices that were optimised to fit the upland area classified by the EU as Severely Disadvantaged Area (SDA) for 1961–1990. Upland areas within the SDA covered all altitudinal ranges, whereas the maximum altitude of lowland areas outside the SDA was ca. 300 m. In general, the climatic index based on the ratio between annual accumulated temperature (as a measure of growing season length) and annual precipitation predicted 96% of the mapped SDA area, slightly better than the indices based on annual or seasonal water deficit. Overall, all climatic indices showed that upland environments would be exposed to some degree of change by 2071–2100 under UKCIP02 climate projections for the high and low emissions scenarios. The projected area declined by 13 to 51% across the 3 indices for the low emissions scenario and by 24 to 84% for the high emissions scenario. The mean altitude of the upland area increased by +11 to +86 m for the low scenario and by +21 to +178 m for the high scenario. Low-altitude areas in eastern and southern Great Britain were most vulnerable to change. These projected climatic changes are likely to affect upland habitat composition, long-term soil carbon storage and wider ecosystem service provision, although it is not yet possible to determine the rate at which this might occur.
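The abstract does not spell out the exact index definition, so the sketch below assumes the simplest reading: annual accumulated temperature computed as degree-days above a base threshold, divided by annual precipitation. The base temperature and the classification cut-off are hypothetical values, not those of the paper.

```python
# Minimal sketch of an accumulated-temperature / precipitation index;
# the 0 degC base and the upland cut-off are illustrative assumptions.
import numpy as np

def upland_index(daily_tmean_degc, annual_precip_mm, base_degc=0.0):
    """Ratio of annual accumulated temperature (degree-days above a base
    threshold, a proxy for growing-season length) to annual precipitation."""
    accumulated = np.sum(np.maximum(np.asarray(daily_tmean_degc) - base_degc,
                                    0.0))
    return accumulated / annual_precip_mm

# Cool, wet cells score low; a cut-off on the index would be tuned so that
# the classified area best matches the mapped SDA.
```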
Abstract:
Research was conducted on methodological issues concerning the Theory of Planned Behaviour (TPB), determining appropriate measurement (direct and indirect) and plausible scaling techniques (unipolar and bipolar) for the constructs (attitude, subjective norm, perceived behavioural control and intention) that are important in explaining farm-level tree planting in Pakistan. Unipolar scoring of beliefs showed higher correlations among the TPB constructs than the bipolar scaling technique. Both direct and indirect methods yielded significant results in explaining the intention to practise farm forestry, except for the belief-based measure of perceived behavioural control, which was statistically non-significant. A need to examine the scoring of perceived behavioural control (PBC) more carefully is expressed.
Abstract:
A series of model experiments with the coupled Max-Planck-Institute ECHAM5/OM climate model has been investigated and compared with microwave measurements from the Microwave Sounding Unit (MSU) and re-analysis data for the period 1979–2008. The evaluation is carried out by computing the Temperature in the Lower Troposphere (TLT) and Temperature in the Middle Troposphere (TMT) using the MSU weights from both the University of Alabama in Huntsville (UAH) and Remote Sensing Systems (RSS), restricting the study primarily to the tropical oceans. When forced by analysed sea surface temperature the model accurately reproduces the time evolution of the mean outgoing tropospheric microwave radiation, especially over tropical oceans, but with a minor bias towards higher temperatures in the upper troposphere. The latest re-analysis data from the 25-year Japanese Re-Analysis (JRA25) and the European Centre for Medium-Range Weather Forecasts Interim Re-analysis are in very close agreement with the time evolution of the MSU data, with correlations of 0.98 and 0.96, respectively. The re-analysis trends are similar to the trends obtained from UAH but smaller than the trends from RSS. Comparison of TLT, computed from observations from UAH and RSS, with sea surface temperature indicates that RSS has a warm bias after 1993. In order to assess the significance of the tropospheric linear temperature trends, we determined the natural variability of 30-year trends from a 500-year control integration of the coupled ECHAM5 model. The model exhibits natural unforced variations of the 30-year tropospheric trend within ±0.2 K/decade for the tropical oceans. This general result is supported by similar results from the Geophysical Fluid Dynamics Laboratory (GFDL) coupled climate model. Present MSU observations from UAH for the period 1979–2008 are well within this range, but RSS is close to the upper positive limit of this variability. We have also compared the trend of the vertical lapse rate over the tropical oceans, assuming that the difference between TLT and TMT is an approximate measure of the lapse rate. The TLT–TMT trend is larger in both the measurements and in JRA25 than in the model runs, by 0.04–0.06 K/decade. Furthermore, all 30-year TLT–TMT trends of the unforced 500-year integration vary within ±0.03 K/decade, suggesting that the models have a minor systematic warm bias in the upper troposphere.
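The significance test used here, taking the spread of all overlapping 30-year trends in a long unforced control run as the envelope of natural variability, can be sketched as follows. The AR(1) series below is a synthetic stand-in for the 500-year control integration, not model output.

```python
# Minimal sketch: distribution of overlapping 30-year linear trends in a
# long unforced control series; the AR(1) series is illustrative only.
import numpy as np

def overlapping_trends(annual_series, window_years=30):
    """Linear trends (units per decade) of every overlapping window."""
    t = np.arange(window_years)
    trends = []
    for start in range(len(annual_series) - window_years + 1):
        seg = annual_series[start:start + window_years]
        slope = np.polyfit(t, seg, 1)[0]    # units per year
        trends.append(10.0 * slope)         # units per decade
    return np.array(trends)

rng = np.random.default_rng(1)
control = np.zeros(500)
for i in range(1, 500):                     # toy AR(1) "control run"
    control[i] = 0.6 * control[i - 1] + rng.normal(scale=0.1)
spread = overlapping_trends(control)
# An observed trend outside this unforced range would be deemed significant.
print(spread.min(), spread.max())
```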
Abstract:
Carsberg (2002) suggested that the periodic valuation accuracy studies undertaken by, amongst others, IPD/Drivers Jonas (2003) should be undertaken every year and be sponsored by the RICS, which acts as the self-regulating body for valuations in the UK. This paper does not address the wider issues concerning the nature of the properties which are sold and whether sale prices are influenced by prior valuations; it considers solely the technical issues concerning the timing of the valuation and sales data. The study uses valuations and sales data from the Investment Property Databank UK Monthly Index to attempt to identify the date on which sale data are divulged to valuers. This information will inform accuracy studies that use the closeness of a valuation to the sale completion date as a cut-off criterion for excluding data from the analysis. It will also, assuming valuers are informed quickly of any agreed sales, help to determine the actual date on which the sale was agreed rather than the completion date, which includes a period of due diligence between when the sale is agreed and its completion. Valuations should be updated to this date, rather than to the formal completion date, if a reliable measure of valuation accuracy is to be determined. An accuracy study is then undertaken using a variety of updating periods, and the differences between the results are examined. The paper concludes that the sale only becomes known to valuers in the month prior to the sale taking place, which implies either that sales due diligence procedures are shortening or that valuers are not told quickly of agreed sale prices. Studies that adopt a four-month cut-off between valuation and sale completion dates are over-cautious; the cut-off could be reduced to two months without compromising the data.
Abstract:
The existence of a specialized imitation module in humans is hotly debated. Studies suggesting a specific imitation impairment in individuals with autism spectrum disorders (ASD) support a modular view. However, the voluntary imitation tasks used in these studies (which require socio-cognitive abilities in addition to imitation for successful performance) cannot support claims of a specific impairment. Accordingly, an automatic imitation paradigm (a ‘cleaner’ measure of imitative ability) was used to assess the imitative ability of 16 adults with ASD and 16 non-autistic matched control participants. Participants performed a prespecified hand action in response to observed hand actions performed either by a human or a robotic hand. On compatible trials the stimulus and response actions matched, while on incompatible trials the two actions did not match. Replicating previous findings, the Control group showed an automatic imitation effect: responses on compatible trials were faster than those on incompatible trials. This effect was greater when responses were made to human than to robotic actions (‘animacy bias’). The ASD group also showed an automatic imitation effect and a larger animacy bias than the Control group. We discuss these findings with reference to the literature on imitation in ASD and theories of imitation.
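The two effects reported above are typically quantified from trial-level reaction times, as in the sketch below; the column names and pandas-based layout are illustrative assumptions, not the study's analysis code.

```python
# Minimal sketch: automatic imitation effect and animacy bias from
# trial-level reaction times (RTs); column names are assumptions.
import pandas as pd

def imitation_effects(trials: pd.DataFrame) -> pd.Series:
    """trials needs columns: rt (ms), compatible (bool), human (bool)."""
    mean_rt = trials.groupby(["human", "compatible"])["rt"].mean()
    # Automatic imitation effect: incompatible minus compatible mean RT.
    effect_human = mean_rt[(True, False)] - mean_rt[(True, True)]
    effect_robot = mean_rt[(False, False)] - mean_rt[(False, True)]
    return pd.Series({
        "imitation_effect_human": effect_human,
        "imitation_effect_robot": effect_robot,
        # Animacy bias: larger imitation effect for human than robot actions.
        "animacy_bias": effect_human - effect_robot,
    })
```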
Abstract:
Six land surface models and five global hydrological models participate in a model intercomparison project (WaterMIP), which for the first time compares simulation results of these different classes of models in a consistent way. In this paper the simulation setup is described and aspects of the multi-model global terrestrial water balance are presented. All models were run at 0.5 degree spatial resolution for the global land areas for a 15-year period (1985–1999) using a newly developed global meteorological dataset. Simulated global terrestrial evapotranspiration, excluding Greenland and Antarctica, ranges from 415 to 586 mm year-1 (60,000 to 85,000 km3 year-1), and simulated runoff ranges from 290 to 457 mm year-1 (42,000 to 66,000 km3 year-1). Both the mean and median runoff fractions for the land surface models are lower than those of the global hydrological models, although the range is wider. Significant simulation differences between land surface and global hydrological models are found to be caused by the snow scheme employed. The physically based energy balance approach used by land surface models generally results in lower snow water equivalent values than the conceptual degree-day approach used by global hydrological models. Some differences in simulated runoff and evapotranspiration are explained by model parameterizations, although the processes included and parameterizations used are not distinct to either land surface models or global hydrological models. The results show that differences between models are a major source of uncertainty. Climate change impact studies thus need to use not only multiple climate models but also some other measure of uncertainty (e.g. multiple impact models).
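For reference, a minimal sketch of the conceptual degree-day melt scheme attributed above to the global hydrological models; the melt factor and threshold are illustrative values, not those of any participating model.

```python
# Minimal sketch of a degree-day snowmelt scheme; parameter values are
# illustrative assumptions.
def degree_day_melt(swe_mm, tmean_degc, ddf_mm_per_degc_day=3.0,
                    t_melt_degc=0.0):
    """Daily snowmelt (mm): proportional to the excess of air temperature
    over a melt threshold, capped by the available snow water equivalent."""
    potential = ddf_mm_per_degc_day * max(tmean_degc - t_melt_degc, 0.0)
    return min(potential, swe_mm)
```

An energy-balance scheme would instead close the surface energy budget (net radiation, turbulent fluxes, ground heat flux) and convert the residual energy into melt, which, as the intercomparison found, generally yields lower snow water equivalent.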
Abstract:
In this paper sequential importance sampling is used to assess the impact of observations on an ensemble prediction of the decadal path transitions of the Kuroshio Extension (KE). This particle filtering approach gives access to the probability density of the state vector, which allows us to determine the predictive power (an entropy-based measure) of the ensemble prediction. The proposed set-up makes use of an ensemble that, at each time, samples the climatological probability distribution. Then, in a post-processing step, the impact of different sets of observations is measured by the increase in predictive power of the ensemble over the climatological signal during one year. The method is applied in an identical-twin experiment for the Kuroshio Extension using a reduced-gravity shallow-water model. We investigate the impact of assimilating velocity observations from different locations during the elongated and the contracted meandering states of the KE. Optimal observation locations correspond to regions with strong potential vorticity gradients. For the elongated state the optimal location is in the first meander of the KE; during the contracted state it is located south of Japan, where the Kuroshio separates from the coast.
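One common entropy-based formulation of prediction utility is the relative entropy of the forecast ensemble with respect to climatology; the histogram-based sketch below illustrates the idea, though the paper's exact definition of predictive power may differ.

```python
# Minimal sketch: relative entropy (KL divergence) of a forecast ensemble
# with respect to climatology, estimated from shared-bin histograms.
import numpy as np

def relative_entropy(forecast_samples, climatology_samples, bins=30):
    """D(forecast || climatology) in nats."""
    lo = min(forecast_samples.min(), climatology_samples.min())
    hi = max(forecast_samples.max(), climatology_samples.max())
    edges = np.linspace(lo, hi, bins + 1)
    p, _ = np.histogram(forecast_samples, bins=edges, density=True)
    q, _ = np.histogram(climatology_samples, bins=edges, density=True)
    width = edges[1] - edges[0]
    p, q = p * width, q * width              # densities -> probability masses
    mask = p > 0
    # A small floor on q avoids division by zero in sparsely sampled bins.
    return np.sum(p[mask] * np.log(p[mask] / np.maximum(q[mask], 1e-12)))
```

Zero divergence means the ensemble adds nothing over climatology; the larger the divergence, the greater the impact of the assimilated observations on the prediction.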
Abstract:
Using UK equity index data, this paper considers the impact of news on time-varying measures of beta, the usual measure of undiversifiable risk. The empirical model implies that beta depends on news about the market and news about the sector. The asymmetric response of beta to news about the market is consistent across all sectors considered. Recent research is divided as to whether abnormalities in equity returns arise from changes in expected returns in an efficient market or from over-reactions to new information. The evidence suggests that such abnormalities may be due to changes in expected returns caused by time variation and asymmetry in beta.
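As a rough illustration of the asymmetry being tested (not the paper's model, which would use a formal time-varying specification such as a bivariate GARCH), one can compare betas estimated on subsamples that follow good and bad market news:

```python
# Minimal sketch: betas conditional on the sign of lagged market "news";
# the subsample split is an illustrative diagnostic, not the paper's model.
import numpy as np

def conditional_betas(sector_ret, market_ret):
    """Beta = cov(r_sector, r_market) / var(r_market), on subsamples that
    follow negative and positive lagged market returns respectively."""
    lagged = market_ret[:-1]
    rs, rm = sector_ret[1:], market_ret[1:]
    betas = {}
    for label, mask in (("after_bad_news", lagged < 0),
                        ("after_good_news", lagged >= 0)):
        cov = np.cov(rs[mask], rm[mask])
        betas[label] = cov[0, 1] / cov[1, 1]
    return betas

# A higher beta after bad news than after good news is the asymmetric
# response pattern described above.
```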
Abstract:
1. Wild bees are one of the most important groups of pollinators in the temperate zone. Population declines therefore have potentially negative impacts for both crop and wildflower pollination. Although heavy metal pollution is recognized to be a problem affecting large parts of the European Union, we currently lack insights into the effects of heavy metals on wild bees. 2. We investigated whether heavy metal pollution is a potential threat to wild bee communities by comparing (i) species number, (ii) diversity and (iii) abundance, as well as (iv) natural mortality of emerging bees, along two independent gradients of heavy metal pollution, one at Olkusz (OLK), Poland and the other at Avonmouth (AVO), UK. We used standardized nesting traps to measure species richness and abundance of wild bees, and we recorded the heavy metal concentration in pollen collected by the red mason bee Osmia rufa as a measure of pollution. 3. The concentrations of cadmium, lead and zinc in pollen collected by bees ranged from background levels at unpolluted sites [OLK: 1.3, 43.4, 99.8 (mg kg−1); AVO: 0.8, 42.0, 56.0 (mg kg−1), respectively] to high levels at sites in the vicinity of the OLK and AVO smelters [OLK: 6.7, 277.0, 440.1 (mg kg−1); AVO: 9.3, 356.2, 592.4 (mg kg−1), respectively]. 4. We found that with increasing heavy metal concentration there was a steady decrease in the number, diversity and abundance of solitary wild bees. At the most polluted sites, traps were empty or contained single occupants, whereas at unpolluted sites the nesting traps collected from 4 to 5 species represented by up to ten individuals. Moreover, the proportion of dead individuals of the solitary bee Megachile ligniseca increased along the heavy metal pollution gradient at OLK, from 0.2 at uncontaminated sites to 0.5 at sites with a high concentration of pollution. 5. Synthesis and applications. Our findings highlight the negative relationship between heavy metal pollution and populations of wild bees and suggest that increasing wild bee richness in highly contaminated areas will require special conservation strategies. These may include creating suitable nesting sites and sowing a mixture of flowering plants, as well as installing artificial nests with wild bee cocoons in polluted areas. Applying protection plans to wild pollinating bee communities in heavy metal-contaminated areas will contribute to integrated land rehabilitation to minimize the impact of pollution on the environment.
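The community measures compared along the gradients (species richness, diversity, abundance, and mortality of emerging bees) can be computed from per-site trap counts as in the sketch below; the example counts are invented for illustration and the use of the Shannon index as the diversity measure is an assumption.

```python
# Minimal sketch: per-site community metrics from nest-trap counts; the
# Shannon index and the example numbers are illustrative assumptions.
import numpy as np

def community_metrics(counts, n_dead=0, n_total=None):
    """counts: individuals per species found in one site's traps."""
    counts = np.asarray([c for c in counts if c > 0], dtype=float)
    abundance = counts.sum()
    p = counts / abundance                 # relative species frequencies
    metrics = {
        "richness": len(counts),
        "shannon_H": float(-np.sum(p * np.log(p))),
        "abundance": float(abundance),
    }
    if n_total:
        metrics["mortality"] = n_dead / n_total
    return metrics

print(community_metrics([6, 3, 1, 1], n_dead=2, n_total=11))
```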
Abstract:
This paper investigates the underpricing of IPOs on the Stock Exchange of Mauritius (SEM). Considering the whole population of firms that went public between the inception of the SEM and 2010, the results show an average degree of underpricing in the range of 10 to 20%. Using a regression approach, we demonstrate that the aftermarket risk level and the auditor's reputation both have a significant positive impact on initial returns. We propose the use of the Z-score as a composite measure of a firm's ex ante financial strength, and find that it has a significant negative effect on the degree of short-run underpricing.
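Assuming the composite measure is the classic Altman (1968) Z-score (the abstract does not name the variant), it is a weighted sum of five accounting ratios:

```python
# Minimal sketch of the classic Altman (1968) Z-score; whether this exact
# variant is the paper's measure is an assumption.
def altman_z(working_capital, retained_earnings, ebit, market_equity,
             sales, total_assets, total_liabilities):
    """Altman Z-score: higher values indicate greater ex ante financial
    strength (lower predicted bankruptcy risk)."""
    return (1.2 * working_capital / total_assets
            + 1.4 * retained_earnings / total_assets
            + 3.3 * ebit / total_assets
            + 0.6 * market_equity / total_liabilities
            + 1.0 * sales / total_assets)
```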
Abstract:
This paper examines the interaction of spatial and dynamic aspects of resource extraction from forests by local people. Highly cyclical and varied across space and time, the patterns of resource extraction resulting from the spatial–temporal model bear little resemblance to the patterns drawn from focusing on either spatial or temporal aspects of extraction alone. Ignoring this variability inaccurately depicts villagers' dependence on different parts of the forest and could result in inappropriate policies. Similarly, the spatial links in extraction decisions imply that policies imposed in one area can have unintended consequences in other areas. Combining the spatial–temporal model with a measure of success in community forest management (the ability to avoid open-access resource degradation) characterizes the impact of incomplete property rights on patterns of resource extraction and stocks.
Abstract:
Nematic monodomain liquid crystalline elastomers have been prepared through in situ cross-linking of an acrylate-based side-chain liquid crystalline polymer in a magnetic field. At the nematic–isotropic transition, the sample is found to undergo an anisotropic shape change: its dimensions increase perpendicular to the director and decrease parallel to it, which is consistent with alignment of the polymer backbone parallel to the direction of mesogen alignment in the nematic state. From a quantitative investigation of this behaviour, we estimate the level of backbone anisotropy for the elastomer. As a second measure of the backbone anisotropy, the monodomain sample was physically extended. We have investigated, in particular, the situation where a monodomain sample is deformed with the angle between the director and the extension direction approaching 90°. The behaviour of these acrylate samples on extension is related to alternative theoretical interpretations, and the backbone anisotropy is determined. Comparison of the chain anisotropy derived from these two approaches with the value obtained from previous small-angle neutron scattering measurements on deuterium-labelled mixtures of the same polymer shows that some level of chain anisotropy is retained in the isotropic, or more strictly weakly paranematic, state of the elastomer. The origin and implications of this behaviour are discussed.
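For reference, in the neoclassical (Warner-Terentjev) description of an ideal incompressible monodomain nematic elastomer, the spontaneous shape change at the transition fixes the backbone step-length anisotropy; a sketch in that notation, which may differ from the paper's own analysis:

```latex
% Sketch, assuming an ideal neoclassical nematic elastomer; notation is
% illustrative, not necessarily the paper's.
\[
  \lambda_{\parallel}
  \;=\; \frac{L_{\parallel}^{\mathrm{nem}}}{L_{\parallel}^{\mathrm{iso}}}
  \;=\; \left(\frac{\ell_{\parallel}}{\ell_{\perp}}\right)^{1/3},
  \qquad
  \lambda_{\perp} \;=\; \lambda_{\parallel}^{-1/2},
\]
% so the measured length changes at the transition give the backbone
% anisotropy directly: ell_par/ell_perp = lambda_par^3.
```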
Abstract:
In this paper the authors exploit two equivalent formulations of the average rate of material entropy production in the climate system to propose an approximate splitting between contributions due to vertical and eminently horizontal processes. This approach is based only on 2D radiative fields at the surface and at the top of the atmosphere. Using 2D fields at the top of the atmosphere alone, lower bounds to the rate of material entropy production and to the intensity of the Lorenz energy cycle are derived. By introducing a measure of the efficiency of the planetary system with respect to horizontal thermodynamic processes, it is possible to gain insight into a previous intuition regarding the possibility of defining a baroclinic heat engine extracting work from the meridional heat flux. The approximate formula for the material entropy production is verified and used to study the global thermodynamic properties of climate models (CMs) included in the Program for Climate Model Diagnosis and Intercomparison (PCMDI)/phase 3 of the Coupled Model Intercomparison Project (CMIP3) dataset under preindustrial climate conditions. It is found that about 90% of the material entropy production is due to vertical processes such as convection, whereas the large-scale meridional heat transport contributes only about 10% of the total. This suggests that the traditional two-box models used to provide a minimal representation of entropy production in planetary systems are not appropriate, whereas a basic but conceptually correct description can be framed in terms of a four-box model. The total material entropy production is typically 55 mW m−2 K−1, with discrepancies on the order of 5%, and the CMs' baroclinic efficiencies are clustered around 0.055. The lower bounds on the intensity of the Lorenz energy cycle featured by CMs are found to be around 1.0–1.5 W m−2, which implies that the derived inequality is rather stringent. When looking at the variability and covariability of the considered thermodynamic quantities, the agreement among CMs is worse, suggesting that the description of feedbacks is more uncertain. The contributions to material entropy production from vertical and horizontal processes are positively correlated, so that no compensation mechanism seems to be in place. Quite consistently among CMs, the variability of the efficiency of the system is a better proxy for the variability of the entropy production due to horizontal processes than that of the large-scale heat flux. The possibility of providing constraints on the 3D dynamics of the fluid envelope based only on 2D observations of radiative fluxes seems promising for the observational study of planets and for testing numerical models.
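The two-box picture that the paper argues is insufficient can be written compactly; in the hedged sketch below, $\Phi$ denotes the large-scale meridional heat transport between a warm box at temperature $T_w$ and a cold box at $T_c$, with notation chosen for illustration rather than taken from the paper.

```latex
% Sketch of the standard two-box entropy production budget that the
% four-box description refines; symbols are illustrative.
\[
  \dot{S}^{\,\mathrm{2box}}_{\mathrm{mat}}
  \;\approx\; \Phi \left( \frac{1}{T_c} - \frac{1}{T_w} \right),
\]
% which captures only the (roughly 10%) horizontal contribution and misses
% the dominant vertical (convective) term, hence the need for a four-box
% description.
```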