930 results for UNCERTAINTY PRINCIPLE


Relevance: 20.00%

Abstract:

Attitudes towards risk and uncertainty have been shown to be highly context-dependent and sensitive to the measurement technique employed. We present data collected in controlled experiments with 2,939 subjects in 30 countries, measuring risk and uncertainty attitudes through incentivized measures as well as survey questions. Our data show clearly that measures correlate not only within decision contexts or measurement methods, but also across contexts and methods. This points to the existence of one underlying “risk preference”, which influences attitudes independently of the measurement method or choice domain. We furthermore find that answers to a general and a financial survey question correlate with incentivized lottery choices in most countries. Incentivized and survey measures also correlate significantly between countries. This opens up the possibility of conducting cultural comparisons of risk attitudes using survey instruments.

Relevance: 20.00%

Abstract:

Methods to explicitly represent uncertainties in weather and climate models have been developed and refined over the past decade, and have reduced biases and improved forecast skill when implemented in the atmospheric component of models. These methods have not yet been applied to the land surface component of models. Since the land surface is strongly coupled to the atmospheric state at certain times and in certain places (such as the European summer of 2003), improvements in the representation of land surface uncertainty may potentially lead to improvements in atmospheric forecasts for such events. Here we analyse seasonal retrospective forecasts for 1981–2012 performed with the European Centre for Medium-Range Weather Forecasts’ (ECMWF) coupled ensemble forecast model. We consider two methods of incorporating uncertainty into the land surface model (H-TESSEL): stochastic perturbation of tendencies, and static perturbation of key soil parameters. We find that the perturbed parameter approach considerably improves the forecast of extreme air temperature for summer 2003, through better representation of negative soil moisture anomalies and upward sensible heat flux. Averaged across all the reforecasts, the perturbed parameter experiment shows relatively little impact on the mean bias, suggesting that perturbations of at least this magnitude can be applied to the land surface without any degradation of model climate. There is also little impact on skill averaged across all reforecasts, and some evidence of overdispersion for soil moisture. The stochastic tendency experiments show a large overdispersion for the soil temperature fields, indicating that the perturbation here is too strong. There is also some indication that the forecast of the 2003 warm event is improved in the stochastic experiments; however, the improvement is not as large as that observed for the perturbed parameter experiment.
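The two perturbation strategies compared above can be sketched in a few lines. The function names, parameter values, and perturbation amplitudes below are illustrative assumptions, not the H-TESSEL implementation:

```python
import numpy as np

rng = np.random.default_rng(42)

def step_stochastic_tendency(state, tendency, dt, sigma=0.1):
    """Advance one time step with a multiplicative random perturbation
    of the model tendency (illustrative, redrawn every step)."""
    eps = rng.normal(0.0, sigma)
    return state + dt * tendency * (1.0 + eps)

def perturbed_parameter(nominal, rel_range=0.2):
    """Draw a static perturbed value of a soil parameter once per
    ensemble member, then hold it fixed for the whole forecast."""
    return nominal * (1.0 + rng.uniform(-rel_range, rel_range))

# Example: one member's (hypothetical) soil parameter and a single
# soil-temperature tendency step
k_soil = perturbed_parameter(0.25)   # perturbed once, then fixed
T_new = step_stochastic_tendency(280.0, tendency=1.5e-5, dt=3600.0)
```

The key design difference is that the parameter perturbation is sampled once per member, whereas the tendency perturbation injects fresh noise at every time step.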

Relevance: 20.00%

Abstract:

Debate over the late Quaternary megafaunal extinctions has focussed on whether human colonisation or climatic changes were more important drivers of extinction, with few extinctions being unambiguously attributable to either. Most analyses have been geographically or taxonomically restricted, and the few quantitative global analyses have been limited by coarse temporal resolution or overly simplified climate reconstructions or proxies. We present a global analysis of the causes of these extinctions which uses high-resolution climate reconstructions and explicitly investigates the sensitivity of our results to uncertainty in the palaeological record. Our results show that human colonisation was the dominant driver of megafaunal extinction across the world, but that climatic factors were also important. We identify the geographic regions where future research is likely to have the most impact, with our models reliably predicting extinctions across most of the world, with the notable exception of mainland Asia, where we fail to explain the apparently low rate of extinction found in the fossil record. Our results are highly robust to uncertainties in the palaeological record, and our main conclusions are unlikely to change qualitatively following minor improvements or changes in the dates of extinctions and human colonisation.

Relevance: 20.00%

Abstract:

Model-based estimates of future uncertainty are generally based on the in-sample fit of the model, as when Box-Jenkins prediction intervals are calculated. However, this approach will generate biased uncertainty estimates in real time when there are data revisions. A simple remedy is suggested, and used to generate more accurate prediction intervals for 25 macroeconomic variables, in line with the theory. A simulation study based on an empirically-estimated model of data revisions for US output growth is used to investigate small-sample properties.
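The correction described above can be illustrated with a toy AR(1) model: the conventional Box-Jenkins interval uses only the in-sample residual variance, while a real-time interval adds an assumed variance contribution from future data revisions. All numbers below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(1) series: y_t = phi * y_{t-1} + e_t
phi_true, n = 0.6, 500
e = rng.normal(0.0, 1.0, n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi_true * y[t - 1] + e[t]

# OLS estimate of phi and residual variance (the "in-sample fit")
phi = np.dot(y[:-1], y[1:]) / np.dot(y[:-1], y[:-1])
resid = y[1:] - phi * y[:-1]
sigma2 = resid.var(ddof=1)

def interval(h, sigma2_rev=0.0, z=1.96):
    """95% prediction-interval half-width for an h-step AR(1) forecast.
    sigma2_rev is an assumed extra variance from future data revisions;
    zero recovers the standard in-sample-based interval."""
    var_h = sigma2 * (1 - phi ** (2 * h)) / (1 - phi ** 2) + sigma2_rev
    return z * np.sqrt(var_h)

print(interval(4))                    # conventional interval
print(interval(4, sigma2_rev=0.25))   # widened for revision uncertainty
```

The revision-variance term would in practice be estimated from a model of the revision process, as in the paper's US output growth application.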

Relevance: 20.00%

Abstract:

Background: Coordination of activity between the amygdala and ventromedial prefrontal cortex (vmPFC) is important for fear-extinction learning. Aberrant recruitment of this circuitry is associated with anxiety disorders. Here, we sought to determine if individual differences in future threat uncertainty sensitivity, a potential risk factor for anxiety disorders, underlie compromised recruitment of fear extinction circuitry. Twenty-two healthy subjects completed a cued fear conditioning task with acquisition and extinction phases. During the task, pupil dilation, skin conductance response, and functional magnetic resonance imaging were acquired. We assessed the temporality of fear extinction learning by splitting the extinction phase into early and late extinction. Threat uncertainty sensitivity was measured using self-reported intolerance of uncertainty (IU). Results: During early extinction learning, we found low IU scores to be associated with larger skin conductance responses and right amygdala activity to learned threat vs. safety cues, whereas high IU scores were associated with no skin conductance discrimination and greater activity within the right amygdala to previously learned safety cues. In late extinction learning, low IU scores were associated with successful inhibition of previously learned threat, reflected in comparable skin conductance response and right amygdala activity to learned threat vs. safety cues, whilst high IU scores were associated with continued fear expression to learned threat, indexed by larger skin conductance and amygdala activity to threat vs. safety cues. In addition, high IU scores were associated with greater vmPFC activity to threat vs. safety cues in late extinction. Similar patterns of IU and extinction learning were found for pupil dilation. The results were specific for IU and did not generalize to self-reported trait anxiety.
Conclusions: Overall, the neural and psychophysiological patterns observed here suggest that high IU individuals disproportionately generalize threat during times of uncertainty, which subsequently compromises fear extinction learning. More broadly, these findings highlight the potential of intolerance of uncertainty-based mechanisms to help understand pathological fear in anxiety disorders and inform potential treatment targets.

Relevance: 20.00%

Abstract:

Risk attitudes are known to be sensitive to large stake variations. However, little is known about their sensitivity to moderate variations in stakes. This is important for studies that want to compare risk attitudes between countries or over time. I find that variations of ±20% affect only utility, while larger variations may also affect probability weighting. Surprisingly, the effect on weighting functions is larger for losses than for gains. It is also more pronounced for risk than for uncertainty.
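A minimal sketch of the utility/probability-weighting decomposition referred to above, using a power utility and the Prelec weighting function with illustrative parameter values (not the paper's estimates):

```python
import numpy as np

def utility(x, alpha=0.88):
    """Power utility for gains; alpha < 1 gives concavity."""
    return x ** alpha

def prelec_weight(p, gamma=0.65):
    """Prelec probability weighting function w(p) = exp(-(-ln p)^gamma)."""
    return np.exp(-(-np.log(p)) ** gamma)

def pt_value(p, x):
    """Value of a binary gain prospect (win x with probability p)."""
    return prelec_weight(p) * utility(x)

# Scaling stakes by +/-20%: under this decomposition, only the utility
# component responds to the stake change; w(p) is untouched.
print(pt_value(0.1, 100.0), pt_value(0.1, 120.0))
```

Larger stake variations affecting the weighting function, as the abstract reports, would correspond to gamma itself shifting with the stake level.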

Relevance: 20.00%

Abstract:

Model simulations of the next few decades are widely used in assessments of climate change impacts and as guidance for adaptation. Their non-linear nature reveals a level of irreducible uncertainty which it is important to understand and quantify, especially for projections of near-term regional climate. Here we use large idealised initial condition ensembles of the FAMOUS global climate model with a 1 %/year compound increase in CO2 levels to quantify the range of future temperatures in model-based projections. These simulations explore the role of both atmospheric and oceanic initial conditions and are the largest such ensembles to date. Short-term simulated trends in global temperature are diverse, and cooling periods are more likely to be followed by larger warming rates. The spatial pattern of near-term temperature change varies considerably, but the proportion of the surface showing a warming is more consistent. In addition, ensemble spread in inter-annual temperature declines as the climate warms, especially in the North Atlantic. Over Europe, atmospheric initial condition uncertainty can, for certain ocean initial conditions, lead to 20 year trends in winter and summer in which every location can exhibit either strong cooling or rapid warming. However, the details of the distribution are highly sensitive to the ocean initial condition chosen and particularly the state of the Atlantic meridional overturning circulation. On longer timescales, the warming signal becomes more clear and consistent amongst different initial condition ensembles. An ensemble using a range of different oceanic initial conditions produces a larger spread in temperature trends than ensembles using a single ocean initial condition for all lead times. This highlights the potential benefits from initialising climate predictions from ocean states informed by observations. 
These results suggest that climate projections need to be performed with many more ensemble members than at present, using a range of ocean initial conditions, if the uncertainty in near-term regional climate is to be adequately quantified.

Relevance: 20.00%

Abstract:

Uncertainty of Arctic seasonal to interannual predictions arising from model errors and initial state uncertainty has been widely discussed in the literature, whereas the irreducible forecast uncertainty (IFU) arising from the chaoticity of the climate system has received less attention. However, IFU provides important insights into the mechanisms through which predictability is lost, and hence can inform prioritization of model development and observations deployment. Here, we characterize how internal oceanic and surface atmospheric heat fluxes contribute to IFU of Arctic sea ice and upper ocean heat content in an Earth system model by analyzing a set of idealized ensemble prediction experiments. We find that atmospheric and oceanic heat fluxes are often equally important for driving unpredictable Arctic-wide changes in sea ice and surface water temperatures, and hence contribute equally to IFU. Atmospheric surface heat flux tends to dominate Arctic-wide changes for lead times of up to a year, whereas oceanic heat flux tends to dominate regionally and on interannual time scales. There is in general a strong negative covariance between surface heat flux and ocean vertical heat flux at depth, and anomalies of lateral ocean heat transport are wind-driven, which suggests that the unpredictable oceanic heat flux variability is mainly forced by the atmosphere. These results are qualitatively robust across different initial states, but substantial variations in the amplitude of IFU exist. We conclude that both atmospheric variability and the initial state of the upper ocean are key ingredients for predictions of Arctic surface climate on seasonal to interannual time scales.

Relevance: 20.00%

Abstract:

A generalization of Arakawa and Schubert's convective quasi-equilibrium principle is presented for a closure formulation of mass-flux convection parameterization. The original principle is based on the budget of the cloud work function. This principle is generalized by considering the budget for a vertical integral of an arbitrary convection-related quantity. The closure formulation includes Arakawa and Schubert's quasi-equilibrium, as well as both CAPE and moisture closures, as special cases. The formulation also includes new possibilities for considering vertical integrals that are dependent on convective-scale variables, such as the moisture within convection. The generalized convective quasi-equilibrium is defined by a balance between large-scale forcing and convective response for a given vertically-integrated quantity. The latter takes the form of a convolution of a kernel matrix and a mass-flux spectrum, as in the original convective quasi-equilibrium. The kernel reduces to a scalar when either a bulk formulation is adopted, or only large-scale variables are considered within the vertical integral. Various physical implications of the generalized closure are discussed. These include the possibility that precipitation might be considered as a potentially-significant contribution to the large-scale forcing. Two dicta are proposed as guiding physical principles for specifying a suitable vertically-integrated quantity.
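The balance described above can be written as F_i = Σ_j K_ij m_j: the large-scale forcing of each vertically-integrated quantity is offset by a convolution of a kernel matrix with the mass-flux spectrum, which can then be solved for the spectrum. A toy numerical sketch with made-up values:

```python
import numpy as np

# Illustrative closure with two quantities and two convection types.
# K[i, j] is the response of vertically-integrated quantity i to
# unit mass flux of convection type j (made-up numbers).
K = np.array([[2.0, 0.5],
              [0.3, 1.5]])
F = np.array([1.0, 0.9])   # large-scale forcing of the two quantities

# Mass-flux spectrum that restores quasi-equilibrium: solve K m = F
m = np.linalg.solve(K, F)
print(m)
```

In a bulk formulation, or when only large-scale variables enter the integral, K collapses to a scalar and the solve reduces to a simple division.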

Relevance: 20.00%

Abstract:

While there is an extensive and still growing body of literature on women in academia and the challenges they encounter in career progression, there is little research on their experience specifically within a business school setting. In this study, we attempt to address this gap and examine the experiences and career development of female academics in a business school and how these are impacted by downsizing programmes. To this end, an exploratory case study is conducted. The findings of this study show that female business school academics experience numerous challenges in terms of promotion and development, networking, and the multiple and conflicting demands placed upon them. As a result, the lack of visibility seems to be a pertinent issue in terms of their career progression. Our data also demonstrate that, paradoxically, during periods of downsizing, women become more visible and thus vulnerable to layoffs as a consequence of the challenges and pressures created in their environment during this process. In this paper, we argue that this heightened visibility, and being subject to possible layoffs, further reproduces inequality regimes in academia.

Relevance: 20.00%

Abstract:

The polynyas of the Laptev Sea are regions of particular interest due to the strong formation of Arctic sea-ice. In order to simulate the polynya dynamics and to quantify ice production, we apply the Finite Element Sea-Ice Ocean Model FESOM. In previous simulations, FESOM has been forced with daily atmospheric data from the NCEP (National Centers for Environmental Prediction) reanalysis 1. For the periods 1 April to 9 May 2008 and 1 January to 8 February 2009, we examine the impact of different forcing data: daily and 6-hourly NCEP reanalyses 1 (1.875° x 1.875°), 6-hourly NCEP reanalyses 2 (1.875° x 1.875°), 6-hourly analyses from the GME (Global Model of the German Weather Service) (0.5° x 0.5°), and high-resolution hourly COSMO (Consortium for Small-Scale Modeling) data (5 km x 5 km). In all FESOM simulations, except for those with 6-hourly and daily NCEP 1 data, the openings and closings of polynyas are simulated in broad agreement with satellite products. Over the fast-ice area, the wind fields of all atmospheric data are similar and close to in situ measurements. Over the polynya areas, however, there are strong differences between the forcing data with respect to air temperature and turbulent heat flux. These differences have a strong impact on sea-ice production rates. Depending on the forcing fields, polynya ice production ranges from 1.4 km3 to 7.8 km3 during 1 April to 9 May 2008 and from 25.7 km3 to 66.2 km3 during 1 January to 8 February 2009. Therefore, atmospheric forcing data with high spatial and temporal resolution which account for the presence of the polynyas are needed to reduce the uncertainty in quantifying ice production in polynyas.

Relevance: 20.00%

Abstract:

We establish a methodology for calculating uncertainties in sea surface temperature estimates from coefficient-based satellite retrievals. The uncertainty estimates are derived independently of in-situ data. This enables validation of both the retrieved SSTs and their uncertainty estimate using in-situ data records. The total uncertainty budget comprises a number of components, arising from uncorrelated (e.g. noise), locally systematic (e.g. atmospheric), large-scale systematic, and sampling effects (for gridded products). The importance of distinguishing these components arises in propagating uncertainty across spatio-temporal scales. We apply the method to SST data retrieved from the Advanced Along Track Scanning Radiometer (AATSR) and validate the results for two different SST retrieval algorithms, both at a per-pixel level and for gridded data. We find good agreement between our estimated uncertainties and validation data. This approach to calculating uncertainties in SST retrievals has a wider application to data from other instruments and retrieval of other geophysical variables.
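The reason for distinguishing the components can be shown with a toy propagation calculation: when n pixels are averaged into a grid cell, uncorrelated effects shrink as 1/sqrt(n) while fully systematic effects do not reduce at all. The numbers below are illustrative, not the AATSR budget:

```python
import numpy as np

def propagate(u_random, u_systematic, n):
    """Total standard uncertainty of an n-pixel grid-cell average.
    Uncorrelated effects (e.g. noise) average down as 1/sqrt(n);
    effects that are fully systematic across the cell (e.g. a shared
    atmospheric-correction error) are unchanged by averaging."""
    return np.sqrt(u_random ** 2 / n + u_systematic ** 2)

u_pixel = propagate(0.2, 0.1, 1)     # single pixel
u_cell = propagate(0.2, 0.1, 100)    # 100-pixel cell average
print(u_pixel, u_cell)
```

Note how the cell-average uncertainty is dominated by the systematic term: treating everything as uncorrelated would understate it badly.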

Relevance: 20.00%

Abstract:

Sea surface temperature (SST) data are often provided as gridded products, typically at resolutions of order 0.05 degrees from satellite observations to reduce data volume at the request of data users and facilitate comparison against other products or models. Sampling uncertainty is introduced in gridded products where the full surface area of the ocean within a grid cell cannot be fully observed because of cloud cover. In this paper we parameterise uncertainties in SST as a function of the percentage of clear-sky pixels available and the SST variability in that subsample. This parameterisation is developed from Advanced Along Track Scanning Radiometer (AATSR) data, but is applicable to all gridded L3U SST products at resolutions of 0.05-0.1 degrees, irrespective of instrument and retrieval algorithm, provided that instrument noise propagated into the SST is accounted for. We also calculate the sampling uncertainty of ~0.04 K in Global Area Coverage (GAC) Advanced Very High Resolution Radiometer (AVHRR) products, using related methods.
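One plausible form of such a parameterisation, based on the finite-population variance of a clear-sky subsample, can be sketched as follows. The functional form and all numbers are illustrative assumptions, not the fitted parameterisation from the paper:

```python
import numpy as np

def sampling_uncertainty(sst_std, clear_fraction, n_pixels):
    """Standard uncertainty of a grid-cell mean SST when only a
    clear-sky subsample of the cell is observed.

    sst_std        -- SST variability within the cell (K)
    clear_fraction -- fraction of pixels that are clear-sky
    n_pixels       -- total pixels in the cell

    Uses the finite-population form s * sqrt((1 - f) / (f * N)):
    uncertainty vanishes as coverage approaches 100%."""
    f = clear_fraction
    return sst_std * np.sqrt((1.0 - f) / (f * n_pixels))

# A 0.05-degree cell of ~100 1-km pixels with 0.15 K SST variability:
print(sampling_uncertainty(0.15, 0.25, 100))   # mostly cloudy cell
print(sampling_uncertainty(0.15, 0.90, 100))   # mostly clear cell
```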

Relevance: 20.00%

Abstract:

For any number field we calculate the exact proportion of rational numbers which are everywhere locally a norm but not globally a norm from the number field.

Relevance: 20.00%

Abstract:

Land cover data derived from satellites are commonly used to prescribe inputs to models of the land surface. Since such data inevitably contain errors, quantifying how uncertainties in the data affect a model’s output is important. To do so, a spatial distribution of possible land cover values is required to propagate through the model’s simulation. However, at large scales, such as those required for climate models, such spatial modelling can be difficult. Also, computer models often require land cover proportions at sites larger than the original map scale as inputs, and it is the uncertainty in these proportions that this article discusses. This paper describes a Monte Carlo sampling scheme that generates realisations of land cover proportions from the posterior distribution as implied by a Bayesian analysis that combines spatial information in the land cover map and its associated confusion matrix. The technique is computationally simple and has been applied previously to the Land Cover Map 2000 for the region of England and Wales. This article demonstrates the ability of the technique to scale up to large (global) satellite-derived land cover maps and reports its application to the GlobCover 2009 data product. The results show that, in general, the GlobCover data possess only small biases, with the largest belonging to non-vegetated surfaces. In vegetated surfaces, the most prominent area of uncertainty is Southern Africa, which represents a complex heterogeneous landscape. It is also clear from this study that greater resources need to be devoted to the construction of comprehensive confusion matrices.
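The Monte Carlo idea can be sketched for a single site: each row of the confusion matrix induces a Dirichlet posterior over true-class probabilities, and mixing these by the mapped pixel counts yields realisations of the site's true proportions. All counts below are made up, and the paper's spatial modelling is omitted:

```python
import numpy as np

rng = np.random.default_rng(1)

# Confusion matrix from map validation: rows = mapped class,
# columns = true class (illustrative counts, three classes).
confusion = np.array([[80,  15,  5],
                      [10, 120, 10],
                      [ 5,  10, 45]])

mapped_counts = np.array([400, 350, 250])  # pixels per mapped class at a site

def sample_true_proportions(n_samples=5000):
    """Monte Carlo realisations of the site's true class proportions.
    Each confusion-matrix row + flat prior gives a Dirichlet posterior
    over the true class given the mapped class; mixing the draws by
    the mapped pixel counts produces proportion realisations."""
    draws = np.zeros((n_samples, 3))
    for i, row in enumerate(confusion):
        p = rng.dirichlet(row + 1, size=n_samples)   # +1 = flat prior
        draws += mapped_counts[i] * p
    return draws / mapped_counts.sum()

samples = sample_true_proportions()
print(samples.mean(axis=0))   # posterior mean proportions
```

The spread of `samples` around its mean is exactly the proportion uncertainty that would be propagated through a land surface model.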