808 results for Empirical Predictions
Abstract:
In the mid 1990s the North Atlantic subpolar gyre (SPG) warmed rapidly, with sea surface temperatures (SST) increasing by 1°C in just a few years. By examining initialized hindcasts made with the UK Met Office Decadal Prediction System (DePreSys), it is shown that the warming could have been predicted. Conversely, hindcasts that only consider changes in radiative forcings are not able to capture the rapid warming. Heat budget analysis shows that the success of the DePreSys hindcasts is due to the initialization of anomalously strong northward ocean heat transport. Furthermore, it is found that initializing a strong Atlantic circulation, and in particular a strong Atlantic Meridional Overturning Circulation, is key for successful predictions. Finally, we show that DePreSys is able to predict significant changes in SST and other surface climate variables related to the North Atlantic warming.
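The heat-budget argument above turns on the northward ocean heat transport supplied by the initialization. The sketch below shows, purely for illustration, how such a transport across a latitude section can be computed as the depth and longitude integral of ρ·c_p·v·T; the grid, field values and constants are hypothetical and are not taken from DePreSys output.

```python
import numpy as np

RHO = 1025.0   # seawater density, kg m^-3
CP = 3990.0    # specific heat of seawater, J kg^-1 K^-1

def northward_heat_transport(v, T, dz, dx):
    """Meridional ocean heat transport across one latitude section (W).

    v  : meridional velocity, shape (depth, longitude), m s^-1
    T  : temperature, same shape, K or degC relative to a reference
    dz : layer thicknesses, shape (depth,), m
    dx : zonal grid spacing, shape (longitude,), m
    """
    flux = RHO * CP * v * T                      # W m^-2 through the section
    return np.sum(flux * dz[:, None] * dx[None, :])

# Toy example on a 30-level x 100-point section
rng = np.random.default_rng(0)
v = 0.02 * rng.standard_normal((30, 100))        # m s^-1
T = 10.0 + rng.standard_normal((30, 100))        # degC
dz = np.full(30, 50.0)                           # 50 m layers
dx = np.full(100, 10e3)                          # 10 km spacing
print(f"Heat transport: {northward_heat_transport(v, T, dz, dx) / 1e15:.3f} PW")
```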
Abstract:
We evaluated the accuracy of six watershed models of nitrogen export in streams (kg km⁻² yr⁻¹) developed for use in large watersheds and representing various empirical and quasi-empirical approaches described in the literature. These models differ in their methods of calibration and have varying levels of spatial resolution and process complexity, which potentially affect the accuracy (bias and precision) of the model predictions of nitrogen export and source contributions to export. Using stream monitoring data and detailed estimates of the natural and cultural sources of nitrogen for 16 watersheds in the northeastern United States (drainage areas of 475 to 70,000 km²), we assessed the accuracy of the model predictions of total nitrogen and nitrate-nitrogen export. The model validation included the use of an error modeling technique to identify biases caused by model deficiencies in quantifying nitrogen sources and biogeochemical processes affecting the transport of nitrogen in watersheds. Most models predicted stream nitrogen export to within 50% of the measured export in a majority of the watersheds. Prediction errors were negatively correlated with cultivated land area, indicating that the watershed models tended to over-predict export in less agricultural, more forested watersheds and to under-predict it in more agricultural basins. The magnitude of these biases differed appreciably among the models. Models with more detailed descriptions of nitrogen sources, land and water attenuation of nitrogen, and water flow paths had considerably lower bias and higher precision in their predictions of nitrogen export.
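The accuracy criteria used above (prediction within 50% of measured export, and the correlation of errors with agricultural land) can be expressed in a few lines. The sketch below uses made-up numbers and column names and is only meant to show the bookkeeping, not to reproduce the study's data.

```python
import pandas as pd

# Hypothetical per-watershed data: measured and predicted N export (kg km^-2 yr^-1)
# and the fraction of cultivated land; all values are illustrative.
df = pd.DataFrame({
    "measured":   [520, 310, 880, 150, 640],
    "predicted":  [430, 390, 700, 240, 610],
    "cultivated": [0.45, 0.10, 0.60, 0.05, 0.35],
})

# Relative error; positive values are over-predictions.
df["rel_error"] = (df["predicted"] - df["measured"]) / df["measured"]

within_50 = (df["rel_error"].abs() <= 0.5).mean()
corr = df["rel_error"].corr(df["cultivated"])

print(f"Share of watersheds predicted within 50%: {within_50:.0%}")
print(f"Correlation of relative error with cultivated land: {corr:+.2f}")
```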
Abstract:
1. It has been postulated that climate warming may pose the greatest threat to species in the tropics, where ectotherms have evolved more thermal specialist physiologies. Although species could rapidly respond to environmental change through adaptation, little is known about the potential for thermal adaptation, especially in tropical species. 2. In the light of the limited empirical evidence available and predictions from mutation-selection theory, we might expect tropical ectotherms to have limited genetic variance to enable adaptation. However, as a consequence of thermodynamic constraints, we might expect this disadvantage to be at least partially offset by a fitness advantage, that is, the ‘hotter-is-better’ hypothesis. 3. Using an established quantitative genetics model and metabolic scaling relationships, we integrate the consequences of the opposing forces of thermal specialization and thermodynamic constraints on adaptive potential by evaluating extinction risk under climate warming. We conclude that the potential advantage of a higher maximal development rate can in theory more than offset the potential disadvantage of lower genetic variance associated with a thermal specialist strategy. 4. Quantitative estimates of extinction risk are fundamentally very sensitive to estimates of generation time and genetic variance. However, our qualitative conclusion that the relative risk of extinction is likely to be lower for tropical species than for temperate species is robust to assumptions regarding the effects of effective population size, mutation rate and birth rate per capita. 5. With a view to improving ecological forecasts, we use this modelling framework to review the sensitivity of our predictions to the model’s underpinning theoretical assumptions and the empirical basis of macroecological patterns that suggest thermal specialization and fitness increase towards the tropics. We conclude by suggesting priority areas for further empirical research.
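The core trade-off in point 3, that a faster maximal development rate can offset lower genetic variance, can be illustrated with a deliberately crude calculation. The toy sketch below is not the quantitative genetics model used in the study; it only compares the adaptive response per unit time of two hypothetical strategies, with all parameter values invented for illustration.

```python
# Toy illustration (not the paper's model): the adaptive response per YEAR is
# taken as a breeder's-equation-like response per generation divided by
# generation time.  A "tropical" strategy with lower additive genetic variance
# but a faster, hotter-is-better development rate (shorter generations) can
# still track a faster rate of environmental change than a "temperate" one.
def response_per_year(genetic_variance, selection_gradient, generation_time_yr):
    response_per_generation = genetic_variance * selection_gradient
    return response_per_generation / generation_time_yr

tropical = response_per_year(genetic_variance=0.5, selection_gradient=0.3,
                             generation_time_yr=0.25)
temperate = response_per_year(genetic_variance=1.0, selection_gradient=0.3,
                              generation_time_yr=1.0)

print(f"Tropical:  {tropical:.2f} trait units per year")
print(f"Temperate: {temperate:.2f} trait units per year")
# Here a 4x shorter generation time more than offsets a 2x lower genetic variance.
```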
Abstract:
Early and effective flood warning is essential to initiate timely measures to reduce loss of life and economic damage. The availability of several global ensemble weather prediction systems through the “THORPEX Interactive Grand Global Ensemble” (TIGGE) archive provides an opportunity to explore new dimensions in early flood forecasting and warning. TIGGE data has been used as meteorological input to the European Flood Alert System (EFAS) for a case study of a flood event in Romania in October 2007. Results illustrate that awareness for this case of flooding could have been raised as early as 8 days before the event and how the subsequent forecasts provide increasing insight into the range of possible flood conditions. This first assessment of one flood event illustrates the potential value of the TIGGE archive and the grand-ensembles approach to raise preparedness and thus to reduce the socio-economic impact of floods.
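The grand-ensemble approach described above amounts to tracking, forecast by forecast, the fraction of ensemble members that exceed a critical discharge. The sketch below illustrates that calculation on synthetic 51-member forecasts at three lead times; the threshold and numbers are invented and do not come from EFAS or the Romanian case study.

```python
import numpy as np

rng = np.random.default_rng(42)
FLOOD_THRESHOLD = 1200.0   # hypothetical critical discharge, m^3 s^-1

def exceedance_probability(ensemble_peaks, threshold=FLOOD_THRESHOLD):
    """Fraction of ensemble members whose forecast peak discharge exceeds the threshold."""
    return float(np.mean(np.asarray(ensemble_peaks) >= threshold))

# Synthetic 51-member forecasts issued 8, 5 and 2 days before the event:
# spread narrows and the signal strengthens as lead time shortens.
forecasts = {
    8: rng.normal(1100.0, 300.0, 51),
    5: rng.normal(1250.0, 200.0, 51),
    2: rng.normal(1350.0, 100.0, 51),
}
for lead_days, peaks in forecasts.items():
    print(f"{lead_days} days ahead: P(flood) = {exceedance_probability(peaks):.0%}")
```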
Abstract:
In order to achieve sustainability it is necessary to balance the interactions between the built and natural environment. Biodiversity plays an important part in sustainability within the built environment, especially as the construction industry comes under increasing pressure to take ecological concerns into account. Bats constitute an important component of urban biodiversity and several species are now highly dependent on buildings, making them particularly vulnerable to anthropogenic and environmental changes. As many buildings suitable for use as bat roosts age, they often require re-roofing, and traditional bituminous roofing felts are frequently being replaced with breathable roofing membranes (BRMs), which are designed to reduce condensation. Whilst the current position of bats is better in many respects than 30 years ago, new building regulations and modern materials may substantially reduce the viability of existing roosts. At the same time, building regulations require that materials be fit for purpose, and with anecdotal evidence that both bats and BRMs may experience problems when the two interact, it is important to know what roost characteristics are essential for house-dwelling bats and how these and BRMs may be affected. This paper reviews current literature and knowledge and considers the possible ways in which bats and BRMs may interact, how this could affect existing bat roosts within buildings, and the implications for BRM service life predictions and warranties. It concludes that in order for the construction and conservation sectors to work together in solving this issue, a set of clear guidelines should be developed for use on a national level.
Abstract:
We employ a large dataset of physical inventory data on 21 different commodities for the period 1993–2011 to empirically analyze the behavior of commodity prices and their volatility as predicted by the theory of storage. We examine two main issues. First, we analyze the relationship between inventory and the shape of the forward curve. Low (high) inventory is associated with forward curves in backwardation (contango), as the theory of storage predicts. Second, we show that price volatility is a decreasing function of inventory for the majority of commodities in our sample. This effect is more pronounced in backwardated markets. Our findings are robust with respect to alternative inventory measures and over the recent commodity price boom.
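Both findings, the inventory/forward-curve relationship and volatility falling with inventory (more strongly in backwardation), can be framed as a simple regression with an interaction term. The sketch below does this on synthetic data; the data-generating process and coefficients are invented and only demonstrate the specification, not the paper's estimates.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500

# Synthetic data: low inventory tends to produce backwardation (negative forward-curve
# slope) and higher price volatility; all numbers are illustrative.
inventory = rng.uniform(0.2, 1.0, n)                         # normalized inventory level
slope = 0.05 * (inventory - 0.6) + rng.normal(0, 0.02, n)    # forward-curve slope (contango > 0)
backwardated = (slope < 0).astype(float)
volatility = (0.40 - 0.25 * inventory
              + 0.10 * backwardated * (0.6 - inventory)
              + rng.normal(0, 0.03, n))

# Volatility on inventory, a backwardation dummy, and their interaction
X = sm.add_constant(np.column_stack([inventory, backwardated, inventory * backwardated]))
print(sm.OLS(volatility, X).fit().params)
```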
Abstract:
We present the first climate prediction of the coming decade made with multiple models, initialized with prior observations. This prediction accrues from an international activity to exchange decadal predictions in near real-time, in order to assess differences and similarities, provide a consensus view to prevent over-confidence in forecasts from any single model, and establish current collective capability. We stress that the forecast is experimental, since the skill of the multi-model system is as yet unknown. Nevertheless, the forecast systems used here are based on models that have undergone rigorous evaluation and individually have been evaluated for forecast skill. Moreover, it is important to publish forecasts to enable open evaluation, and to provide a focus on climate change in the coming decade. Initialized forecasts of the year 2011 agree well with observations, with a pattern correlation of 0.62 compared to 0.31 for uninitialized projections. In particular, the forecast correctly predicted La Niña in the Pacific, and warm conditions in the north Atlantic and USA. A similar pattern is predicted for 2012 but with a weaker La Niña. Indices of Atlantic multi-decadal variability and Pacific decadal variability show no signal beyond climatology after 2015, while temperature in the Niño3 region is predicted to warm slightly by about 0.5 °C over the coming decade. However, uncertainties are large for individual years and initialization has little impact beyond the first 4 years in most regions. Relative to uninitialized forecasts, initialized forecasts are significantly warmer in the north Atlantic sub-polar gyre and cooler in the north Pacific throughout the decade. They are also significantly cooler in the global average and over most land and ocean regions out to several years ahead. However, in the absence of volcanic eruptions, global temperature is predicted to continue to rise, with each year from 2013 onwards having a 50 % chance of exceeding the current observed record. Verification of these forecasts will provide an important opportunity to test the performance of models and our understanding and knowledge of the drivers of climate change.
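The skill numbers quoted above (pattern correlations of 0.62 versus 0.31) compare forecast and observed anomaly maps. A minimal sketch of an area-weighted, centred pattern correlation on a toy grid with synthetic fields is given below; it is not the verification code used for these forecasts.

```python
import numpy as np

def pattern_correlation(forecast, observed, lat):
    """Area-weighted, centred pattern correlation between two anomaly maps.

    forecast, observed : 2-D arrays (lat, lon) of anomalies
    lat                : 1-D array of latitudes in degrees
    """
    w = np.cos(np.deg2rad(lat))[:, None] * np.ones_like(forecast)
    w = w / w.sum()
    f = forecast - np.sum(w * forecast)      # remove the area-weighted mean
    o = observed - np.sum(w * observed)
    return np.sum(w * f * o) / np.sqrt(np.sum(w * f**2) * np.sum(w * o**2))

# Toy 2-degree global grid with partially correlated random fields
rng = np.random.default_rng(0)
lat = np.arange(-89.0, 90.0, 2.0)
obs = rng.standard_normal((lat.size, 180))
fcst = 0.6 * obs + 0.8 * rng.standard_normal(obs.shape)
print(f"Pattern correlation: {pattern_correlation(fcst, obs, lat):.2f}")
```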
Abstract:
The paper reports a study that investigated the relationship between students’ self-predicted and actual General Certificate of Secondary Education results in order to establish the extent of over- and under-prediction and whether this varies by subject and across genders and socio-economic groupings. It also considered the relationship between actual and predicted attainment and attitudes towards going to university. The sample consisted of 109 young people in two schools being followed up from an earlier study. Just over 50% of predictions were accurate and students were much more likely to over-predict than to under-predict. Most errors of prediction were only one grade out and may reflect examination unreliability as well as student misperceptions. Girls were slightly less likely than boys to over-predict but there were no differences associated with social background. Higher levels of attainment, both actual and predicted, were strongly associated with positive attitudes to university. Differences between predictions and results are likely to reflect examination errors as well as pupil errors. There is no evidence that students from more advantaged social backgrounds over-estimate themselves compared with other students, although boys over-estimate themselves compared with girls.
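The over- and under-prediction rates reported above come from comparing each student's predicted grade with the grade obtained. The sketch below shows one way to tabulate that comparison by gender with grades coded numerically; the records and coding are invented for illustration.

```python
import pandas as pd

# Hypothetical records: grades coded numerically (higher is better), e.g. A* = 8 ... G = 1.
df = pd.DataFrame({
    "gender":    ["F", "F", "M", "M", "F", "M"],
    "predicted": [6, 5, 7, 4, 8, 5],
    "actual":    [6, 4, 5, 4, 7, 6],
})

# Positive differences are over-predictions, negative ones under-predictions.
df["outcome"] = pd.cut(df["predicted"] - df["actual"],
                       bins=[-99, -1, 0, 99],
                       labels=["under-predicted", "accurate", "over-predicted"])

print(pd.crosstab(df["gender"], df["outcome"], normalize="index"))
```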
Abstract:
This article analyses the results of an empirical study of the 200 most popular UK-based websites in various sectors of e-commerce services. The study provides empirical evidence of unlawful processing of personal data. It comprises a survey of the methods used to seek and obtain consent to process personal data for direct marketing and advertising, and a test of the frequency of unsolicited commercial emails (UCE) received by customers as a consequence of their registration and submission of personal information to a website. Part One of the article presents a conceptual and normative account of data protection, with a discussion of the ethical values on which EU data protection law is grounded and an outline of the elements that must be in place to seek and obtain valid consent to process personal data. Part Two discusses the outcomes of the empirical study, which reveal a significant gap between EU legal theory and practice in data protection. Although a wide majority of the websites in the sample (69%) have a system in place to ask for separate consent to engage in marketing activities, only 16.2% of them obtain consent that is valid under the standards set by EU law. The test with UCE shows that only one in three websites (30.5%) respects the will of the data subject not to receive commercial communications. It also shows that, when submitting personal data in online transactions, there is a high probability (50%) of encountering a website that will ignore the refusal of consent and send UCE. The article concludes that there is a severe lack of compliance by UK online service providers with essential requirements of data protection law. In this respect, it suggests that the standard of implementation, information, and supervision by the UK authorities is inadequate, especially in light of the clarifications provided at EU level.
Abstract:
Although ensemble prediction systems (EPS) are increasingly promoted as the scientific state-of-the-art for operational flood forecasting, the communication, perception, and use of the resulting alerts have received much less attention. Using a variety of qualitative research methods, including direct user feedback at training workshops, participant observation during site visits to 25 forecasting centres across Europe, and in-depth interviews with 69 forecasters, civil protection officials, and policy makers involved in operational flood risk management in 17 European countries, this article discusses the perception, communication, and use of European Flood Alert System (EFAS) alerts in operational flood management. In particular, this article describes how the design of EFAS alerts has evolved in response to user feedback and desires for a hydrograph-like way of visualizing EFAS outputs. It also documents a variety of forecaster perceptions about the value and skill of EFAS forecasts and the best way of using them to inform operational decision making. EFAS flood alerts were generally welcomed by flood forecasters as a sort of ‘pre-alert’ to spur greater internal vigilance. In most cases, however, they did not lead, by themselves, to further preparatory action or to earlier warnings to the public or emergency services. Forecasters' hesitancy to act in response to medium-term, probabilistic alerts highlights some wider institutional obstacles to the hopes in the research community that EPS will be readily embraced by operational forecasters and lead to immediate improvements in flood incident management. The EFAS experience offers lessons for other hydrological services seeking to implement EPS operationally for flood forecasting and warning.
Abstract:
The objective of this book is to present the quantitative techniques that are commonly employed in empirical finance research together with real world, state of the art research examples. Each chapter is written by international experts in their fields. The unique approach is to describe a question or issue in finance and then to demonstrate the methodologies that may be used to solve it. All of the techniques described are used to address real problems rather than being presented for their own sake and the areas of application have been carefully selected so that a broad range of methodological approaches can be covered. This book is aimed primarily at doctoral researchers and academics who are engaged in conducting original empirical research in finance. In addition, the book will be useful to researchers in the financial markets and also advanced Masters-level students who are writing dissertations.
Abstract:
The facilitation of healthier dietary choices by consumers is a key element of government strategies to combat the rising incidence of obesity in developed and developing countries. Public health campaigns to promote healthier eating often target compliance with recommended dietary guidelines for consumption of individual nutrients such as fats and added sugars. This paper examines the association between improved compliance with dietary guidelines for individual nutrients and excess calorie intake, the most proximate determinant of obesity risk. We apply quantile regressions and counterfactual decompositions to cross-sectional data from the National Diet and Nutrition Survey (2000-01) to assess how excess calorie consumption patterns in the UK are likely to change with improved compliance with dietary guidelines. We find that the effects of compliance vary significantly across different quantiles of calorie consumption. Our results show that compliance with dietary guidelines for individual nutrients, even if successfully achieved, is likely to be associated with only modest shifts in excess calorie consumption patterns. Consequently, public health campaigns that target compliance with dietary guidelines for specific nutrients in isolation are unlikely to have a significant effect on the obesity risk faced by the population.
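The quantile-regression part of the analysis can be sketched briefly: regress calorie intake on a compliance indicator at several quantiles and compare the coefficients. The example below uses synthetic data and a single made-up compliance variable, so it illustrates the specification rather than the survey results.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 2000

# Synthetic survey-like data: a 0/1 indicator of compliance with one nutrient
# guideline and daily calorie intake; all values are illustrative.
compliant = rng.integers(0, 2, n)
calories = (2000 + 400 * rng.standard_normal(n)
            + rng.exponential(300, n) - 120 * compliant)
df = pd.DataFrame({"calories": calories, "compliant": compliant})

# Effect of compliance at different points of the calorie distribution
for q in (0.25, 0.50, 0.75, 0.90):
    fit = smf.quantreg("calories ~ compliant", df).fit(q=q)
    print(f"q = {q:.2f}: compliance effect = {fit.params['compliant']:.0f} kcal")
```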
Abstract:
This paper considers how trading volume affects the first three moments of REIT returns. Consistent with previous studies of the broader stock market, we find that volume is a significant factor with respect to both returns and volatility. We also find evidence supporting Hong and Stein’s (2003) Investor Heterogeneity Theory, in that skewness in REIT index returns is significantly related to volume. Furthermore, we report findings that show the influence of the variability of volume on skewness.
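One way to examine the volume-skewness link described above is to compute a rolling realized skewness of returns and regress it on detrended volume. The sketch below does this on synthetic series; the data, window length and detrending choice are all illustrative, not the paper's specification.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 1500

# Synthetic daily REIT index returns and (log-normal) trading volume, for illustration only.
returns = 0.01 * rng.standard_normal(n)
volume = np.exp(rng.normal(10.0, 0.3, n))

df = pd.DataFrame({"ret": returns, "vol": volume})
df["skew_60d"] = df["ret"].rolling(60).skew()            # rolling realized skewness
df["dvol"] = df["vol"] - df["vol"].rolling(60).mean()    # detrended volume

sample = df.dropna()
X = sm.add_constant(sample[["dvol"]])
print(sm.OLS(sample["skew_60d"], X).fit().params)
```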
Abstract:
Brain activity can be measured non-invasively with functional imaging techniques. Each pixel in such an image represents a neural mass of about 10⁵ to 10⁷ neurons. Mean field models (MFMs) approximate their activity by averaging out neural variability while retaining salient underlying features, like neurotransmitter kinetics. However, MFMs incorporating the regional variability, realistic geometry and connectivity of cortex have so far appeared intractable. This lack of biological realism has led to a focus on gross temporal features of the EEG. We address these impediments and showcase a "proof of principle" forward prediction of co-registered EEG/fMRI for a full-size human cortex in a realistic head model with anatomical connectivity, see figure 1. MFMs usually assume homogeneous neural masses, isotropic long-range connectivity and simplistic signal expression to allow rapid computation with partial differential equations. But these approximations are insufficient in particular for the high spatial resolution obtained with fMRI, since different cortical areas vary in their architectonic and dynamical properties, have complex connectivity, and can contribute non-trivially to the measured signal. Our code instead supports the local variation of model parameters and freely chosen connectivity for many thousand triangulation nodes spanning a cortical surface extracted from structural MRI. This allows the introduction of realistic anatomical and physiological parameters for cortical areas and their connectivity, including both intra- and inter-area connections. Proper cortical folding and conduction through a realistic head model is then added to obtain accurate signal expression for a comparison to experimental data. To showcase the synergy of these computational developments, we predict simultaneously EEG and fMRI BOLD responses by adding an established model for neurovascular coupling and convolving "Balloon-Windkessel" hemodynamics. We also incorporate regional connectivity extracted from the CoCoMac database [1]. Importantly, these extensions can be easily adapted according to future insights and data. Furthermore, while our own simulation is based on one specific MFM [2], the computational framework is general and can be applied to models favored by the user. Finally, we provide a brief outlook on improving the integration of multi-modal imaging data through iterative fits of a single underlying MFM in this realistic simulation framework.
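The abstract relies on "Balloon-Windkessel" hemodynamics to turn neural activity into a BOLD prediction. A minimal, self-contained sketch of that hemodynamic model, with commonly quoted default parameter values and an invented neural drive, is given below; it is not the code described in the abstract.

```python
import numpy as np

def balloon_windkessel(z, dt=0.01, kappa=0.65, gamma=0.41, tau=0.98,
                       alpha=0.32, rho=0.34, V0=0.02):
    """Balloon-Windkessel hemodynamics driven by a neural activity time series z(t).

    Forward-Euler integration; parameter values are commonly used defaults.
    Returns the simulated BOLD signal, one sample per input sample.
    """
    s, f, v, q = 0.0, 1.0, 1.0, 1.0          # vasodilatory signal, inflow, volume, deoxyhemoglobin
    k1, k2, k3 = 7.0 * rho, 2.0, 2.0 * rho - 0.2
    bold = np.empty(len(z))
    for i, zi in enumerate(z):
        ds = zi - kappa * s - gamma * (f - 1.0)
        df = s
        dv = (f - v ** (1.0 / alpha)) / tau
        dq = (f * (1.0 - (1.0 - rho) ** (1.0 / f)) / rho
              - q * v ** (1.0 / alpha - 1.0)) / tau
        s, f, v, q = s + dt * ds, f + dt * df, v + dt * dv, q + dt * dq
        bold[i] = V0 * (k1 * (1.0 - q) + k2 * (1.0 - q / v) + k3 * (1.0 - v))
    return bold

# Drive the model with a 2 s block of neural activity and report the peak response
t = np.arange(0.0, 30.0, 0.01)
z = ((t > 2.0) & (t < 4.0)).astype(float)
print(f"Peak BOLD response: {balloon_windkessel(z).max():.4f}")
```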
Abstract:
The time-mean quasi-geostrophic potential vorticity equation of the atmospheric flow on isobaric surfaces can explicitly include an atmospheric (internal) forcing term of the stationary-eddy flow. In fact, neglecting some non-linear terms in this equation, this forcing can be mathematically expressed as a single function, called the Empirical Forcing Function (EFF), which is equal to the material derivative of the time-mean potential vorticity. Furthermore, the EFF can be decomposed as a sum of seven components, each one representing a forcing mechanism of a different nature. These mechanisms include diabatic components associated with the radiative forcing, latent heat release and frictional dissipation, and components related to transient eddy transports of heat and momentum. All these factors quantify the role of the transient eddies in forcing the atmospheric circulation. In order to assess the relevance of the EFF in diagnosing large-scale anomalies in the atmospheric circulation, the relationship between the EFF and the occurrence of strong ridges over the Eastern North Atlantic, which are often precursors of severe droughts over Western Iberia, is analyzed. For such events, the EFF pattern depicts a clear dipolar structure over the North Atlantic; cyclonic (anticyclonic) forcing of potential vorticity is found upstream (downstream) of the anomalously strong ridges. Results also show that the most significant components are related to the diabatic processes. Lastly, these results highlight the relevance of the EFF in diagnosing large-scale anomalies, also providing some insight into their interaction with different physical mechanisms.
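Since the EFF is defined only verbally above, a schematic rendering of the definition may help; the notation below is illustrative and is not taken from the paper.

```latex
% Schematic rendering of the Empirical Forcing Function (EFF); notation is
% illustrative and not taken from the paper.
\begin{equation}
  \mathrm{EFF} \;\equiv\; \overline{\mathbf{V}} \cdot \nabla \overline{q}
  \;=\; \sum_{i=1}^{7} F_i ,
\end{equation}
where $\overline{q}$ is the time-mean quasi-geostrophic potential vorticity on an
isobaric surface, $\overline{\mathbf{V}}$ is the time-mean flow (so the advection term
is the material derivative of $\overline{q}$ following the mean flow, since
$\partial\overline{q}/\partial t = 0$ for a time mean), and the $F_i$ are the seven
components: diabatic terms (radiative forcing, latent heat release, frictional
dissipation) and terms involving transient-eddy transports of heat and momentum.
```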